Jan 26 04:00:41 np0005595445 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 04:00:41 np0005595445 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 04:00:41 np0005595445 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 04:00:41 np0005595445 kernel: BIOS-provided physical RAM map:
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 04:00:41 np0005595445 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 26 04:00:41 np0005595445 kernel: NX (Execute Disable) protection: active
Jan 26 04:00:41 np0005595445 kernel: APIC: Static calls initialized
Jan 26 04:00:41 np0005595445 kernel: SMBIOS 2.8 present.
Jan 26 04:00:41 np0005595445 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 04:00:41 np0005595445 kernel: Hypervisor detected: KVM
Jan 26 04:00:41 np0005595445 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 04:00:41 np0005595445 kernel: kvm-clock: using sched offset of 3356242937 cycles
Jan 26 04:00:41 np0005595445 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 04:00:41 np0005595445 kernel: tsc: Detected 2799.998 MHz processor
Jan 26 04:00:41 np0005595445 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 26 04:00:41 np0005595445 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 04:00:41 np0005595445 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 04:00:41 np0005595445 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 04:00:41 np0005595445 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 04:00:41 np0005595445 kernel: Using GB pages for direct mapping
Jan 26 04:00:41 np0005595445 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 04:00:41 np0005595445 kernel: ACPI: Early table checksum verification disabled
Jan 26 04:00:41 np0005595445 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 04:00:41 np0005595445 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 04:00:41 np0005595445 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 04:00:41 np0005595445 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 04:00:41 np0005595445 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 04:00:41 np0005595445 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 04:00:41 np0005595445 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 04:00:41 np0005595445 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 04:00:41 np0005595445 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 04:00:41 np0005595445 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 04:00:41 np0005595445 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 04:00:41 np0005595445 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 04:00:41 np0005595445 kernel: No NUMA configuration found
Jan 26 04:00:41 np0005595445 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 26 04:00:41 np0005595445 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 26 04:00:41 np0005595445 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 04:00:41 np0005595445 kernel: Zone ranges:
Jan 26 04:00:41 np0005595445 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 04:00:41 np0005595445 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 04:00:41 np0005595445 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 04:00:41 np0005595445 kernel:  Device   empty
Jan 26 04:00:41 np0005595445 kernel: Movable zone start for each node
Jan 26 04:00:41 np0005595445 kernel: Early memory node ranges
Jan 26 04:00:41 np0005595445 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 04:00:41 np0005595445 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 04:00:41 np0005595445 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 04:00:41 np0005595445 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 26 04:00:41 np0005595445 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 04:00:41 np0005595445 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 04:00:41 np0005595445 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 04:00:41 np0005595445 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 04:00:41 np0005595445 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 04:00:41 np0005595445 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 04:00:41 np0005595445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 04:00:41 np0005595445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 04:00:41 np0005595445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 04:00:41 np0005595445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 04:00:41 np0005595445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 04:00:41 np0005595445 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 04:00:41 np0005595445 kernel: TSC deadline timer available
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Max. logical packages:   8
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Max. logical dies:       8
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Max. dies per package:   1
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Max. threads per core:   1
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Num. cores per package:     1
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Num. threads per package:   1
Jan 26 04:00:41 np0005595445 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 04:00:41 np0005595445 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 04:00:41 np0005595445 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 04:00:41 np0005595445 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 04:00:41 np0005595445 kernel: Booting paravirtualized kernel on KVM
Jan 26 04:00:41 np0005595445 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 04:00:41 np0005595445 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 04:00:41 np0005595445 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 04:00:41 np0005595445 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 04:00:41 np0005595445 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 04:00:41 np0005595445 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 04:00:41 np0005595445 kernel: random: crng init done
Jan 26 04:00:41 np0005595445 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: Fallback order for Node 0: 0 
Jan 26 04:00:41 np0005595445 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 26 04:00:41 np0005595445 kernel: Policy zone: Normal
Jan 26 04:00:41 np0005595445 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 04:00:41 np0005595445 kernel: software IO TLB: area num 8.
Jan 26 04:00:41 np0005595445 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 04:00:41 np0005595445 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 04:00:41 np0005595445 kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 04:00:41 np0005595445 kernel: Dynamic Preempt: voluntary
Jan 26 04:00:41 np0005595445 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 04:00:41 np0005595445 kernel: rcu: 	RCU event tracing is enabled.
Jan 26 04:00:41 np0005595445 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 04:00:41 np0005595445 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 26 04:00:41 np0005595445 kernel: 	Rude variant of Tasks RCU enabled.
Jan 26 04:00:41 np0005595445 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 26 04:00:41 np0005595445 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 04:00:41 np0005595445 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 04:00:41 np0005595445 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 04:00:41 np0005595445 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 04:00:41 np0005595445 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 04:00:41 np0005595445 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 04:00:41 np0005595445 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 04:00:41 np0005595445 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 04:00:41 np0005595445 kernel: Console: colour VGA+ 80x25
Jan 26 04:00:41 np0005595445 kernel: printk: console [ttyS0] enabled
Jan 26 04:00:41 np0005595445 kernel: ACPI: Core revision 20230331
Jan 26 04:00:41 np0005595445 kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 04:00:41 np0005595445 kernel: x2apic enabled
Jan 26 04:00:41 np0005595445 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 04:00:41 np0005595445 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 04:00:41 np0005595445 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 26 04:00:41 np0005595445 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 04:00:41 np0005595445 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 04:00:41 np0005595445 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 04:00:41 np0005595445 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 04:00:41 np0005595445 kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 04:00:41 np0005595445 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 04:00:41 np0005595445 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 04:00:41 np0005595445 kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 04:00:41 np0005595445 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 04:00:41 np0005595445 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 04:00:41 np0005595445 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 04:00:41 np0005595445 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 04:00:41 np0005595445 kernel: x86/bugs: return thunk changed
Jan 26 04:00:41 np0005595445 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 04:00:41 np0005595445 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 04:00:41 np0005595445 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 04:00:41 np0005595445 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 04:00:41 np0005595445 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 04:00:41 np0005595445 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 04:00:41 np0005595445 kernel: Freeing SMP alternatives memory: 40K
Jan 26 04:00:41 np0005595445 kernel: pid_max: default: 32768 minimum: 301
Jan 26 04:00:41 np0005595445 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 04:00:41 np0005595445 kernel: landlock: Up and running.
Jan 26 04:00:41 np0005595445 kernel: Yama: becoming mindful.
Jan 26 04:00:41 np0005595445 kernel: SELinux:  Initializing.
Jan 26 04:00:41 np0005595445 kernel: LSM support for eBPF active
Jan 26 04:00:41 np0005595445 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 04:00:41 np0005595445 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 04:00:41 np0005595445 kernel: ... version:                0
Jan 26 04:00:41 np0005595445 kernel: ... bit width:              48
Jan 26 04:00:41 np0005595445 kernel: ... generic registers:      6
Jan 26 04:00:41 np0005595445 kernel: ... value mask:             0000ffffffffffff
Jan 26 04:00:41 np0005595445 kernel: ... max period:             00007fffffffffff
Jan 26 04:00:41 np0005595445 kernel: ... fixed-purpose events:   0
Jan 26 04:00:41 np0005595445 kernel: ... event mask:             000000000000003f
Jan 26 04:00:41 np0005595445 kernel: signal: max sigframe size: 1776
Jan 26 04:00:41 np0005595445 kernel: rcu: Hierarchical SRCU implementation.
Jan 26 04:00:41 np0005595445 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 26 04:00:41 np0005595445 kernel: smp: Bringing up secondary CPUs ...
Jan 26 04:00:41 np0005595445 kernel: smpboot: x86: Booting SMP configuration:
Jan 26 04:00:41 np0005595445 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 04:00:41 np0005595445 kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 04:00:41 np0005595445 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 26 04:00:41 np0005595445 kernel: node 0 deferred pages initialised in 10ms
Jan 26 04:00:41 np0005595445 kernel: Memory: 7763956K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618368K reserved, 0K cma-reserved)
Jan 26 04:00:41 np0005595445 kernel: devtmpfs: initialized
Jan 26 04:00:41 np0005595445 kernel: x86/mm: Memory block size: 128MB
Jan 26 04:00:41 np0005595445 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 04:00:41 np0005595445 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 04:00:41 np0005595445 kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 04:00:41 np0005595445 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 26 04:00:41 np0005595445 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 04:00:41 np0005595445 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 04:00:41 np0005595445 kernel: audit: initializing netlink subsys (disabled)
Jan 26 04:00:41 np0005595445 kernel: audit: type=2000 audit(1769418039.770:1): state=initialized audit_enabled=0 res=1
Jan 26 04:00:41 np0005595445 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 04:00:41 np0005595445 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 04:00:41 np0005595445 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 04:00:41 np0005595445 kernel: cpuidle: using governor menu
Jan 26 04:00:41 np0005595445 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 04:00:41 np0005595445 kernel: PCI: Using configuration type 1 for base access
Jan 26 04:00:41 np0005595445 kernel: PCI: Using configuration type 1 for extended access
Jan 26 04:00:41 np0005595445 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 04:00:41 np0005595445 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 04:00:41 np0005595445 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 04:00:41 np0005595445 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 04:00:41 np0005595445 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 04:00:41 np0005595445 kernel: Demotion targets for Node 0: null
Jan 26 04:00:41 np0005595445 kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 04:00:41 np0005595445 kernel: ACPI: Added _OSI(Module Device)
Jan 26 04:00:41 np0005595445 kernel: ACPI: Added _OSI(Processor Device)
Jan 26 04:00:41 np0005595445 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 04:00:41 np0005595445 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 04:00:41 np0005595445 kernel: ACPI: Interpreter enabled
Jan 26 04:00:41 np0005595445 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 04:00:41 np0005595445 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 04:00:41 np0005595445 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 04:00:41 np0005595445 kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 04:00:41 np0005595445 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 04:00:41 np0005595445 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [3] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [4] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [5] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [6] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [7] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [8] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [9] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [10] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [11] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [12] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [13] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [14] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [15] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [16] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [17] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [18] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [19] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [20] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [21] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [22] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [23] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [24] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [25] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [26] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [27] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [28] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [29] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [30] registered
Jan 26 04:00:41 np0005595445 kernel: acpiphp: Slot [31] registered
Jan 26 04:00:41 np0005595445 kernel: PCI host bridge to bus 0000:00
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 04:00:41 np0005595445 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 04:00:41 np0005595445 kernel: iommu: Default domain type: Translated
Jan 26 04:00:41 np0005595445 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 04:00:41 np0005595445 kernel: SCSI subsystem initialized
Jan 26 04:00:41 np0005595445 kernel: ACPI: bus type USB registered
Jan 26 04:00:41 np0005595445 kernel: usbcore: registered new interface driver usbfs
Jan 26 04:00:41 np0005595445 kernel: usbcore: registered new interface driver hub
Jan 26 04:00:41 np0005595445 kernel: usbcore: registered new device driver usb
Jan 26 04:00:41 np0005595445 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 04:00:41 np0005595445 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 04:00:41 np0005595445 kernel: PTP clock support registered
Jan 26 04:00:41 np0005595445 kernel: EDAC MC: Ver: 3.0.0
Jan 26 04:00:41 np0005595445 kernel: NetLabel: Initializing
Jan 26 04:00:41 np0005595445 kernel: NetLabel:  domain hash size = 128
Jan 26 04:00:41 np0005595445 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 04:00:41 np0005595445 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 04:00:41 np0005595445 kernel: PCI: Using ACPI for IRQ routing
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 04:00:41 np0005595445 kernel: vgaarb: loaded
Jan 26 04:00:41 np0005595445 kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 04:00:41 np0005595445 kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 04:00:41 np0005595445 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 04:00:41 np0005595445 kernel: pnp: PnP ACPI init
Jan 26 04:00:41 np0005595445 kernel: pnp: PnP ACPI: found 5 devices
Jan 26 04:00:41 np0005595445 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_INET protocol family
Jan 26 04:00:41 np0005595445 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 26 04:00:41 np0005595445 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_XDP protocol family
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 04:00:41 np0005595445 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 04:00:41 np0005595445 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 04:00:41 np0005595445 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 70794 usecs
Jan 26 04:00:41 np0005595445 kernel: PCI: CLS 0 bytes, default 64
Jan 26 04:00:41 np0005595445 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 04:00:41 np0005595445 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 04:00:41 np0005595445 kernel: ACPI: bus type thunderbolt registered
Jan 26 04:00:41 np0005595445 kernel: Trying to unpack rootfs image as initramfs...
Jan 26 04:00:41 np0005595445 kernel: Initialise system trusted keyrings
Jan 26 04:00:41 np0005595445 kernel: Key type blacklist registered
Jan 26 04:00:41 np0005595445 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 26 04:00:41 np0005595445 kernel: zbud: loaded
Jan 26 04:00:41 np0005595445 kernel: integrity: Platform Keyring initialized
Jan 26 04:00:41 np0005595445 kernel: integrity: Machine keyring initialized
Jan 26 04:00:41 np0005595445 kernel: Freeing initrd memory: 87956K
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_ALG protocol family
Jan 26 04:00:41 np0005595445 kernel: xor: automatically using best checksumming function   avx       
Jan 26 04:00:41 np0005595445 kernel: Key type asymmetric registered
Jan 26 04:00:41 np0005595445 kernel: Asymmetric key parser 'x509' registered
Jan 26 04:00:41 np0005595445 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 04:00:41 np0005595445 kernel: io scheduler mq-deadline registered
Jan 26 04:00:41 np0005595445 kernel: io scheduler kyber registered
Jan 26 04:00:41 np0005595445 kernel: io scheduler bfq registered
Jan 26 04:00:41 np0005595445 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 04:00:41 np0005595445 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 04:00:41 np0005595445 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 04:00:41 np0005595445 kernel: ACPI: button: Power Button [PWRF]
Jan 26 04:00:41 np0005595445 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 04:00:41 np0005595445 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 04:00:41 np0005595445 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 04:00:41 np0005595445 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 04:00:41 np0005595445 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 04:00:41 np0005595445 kernel: Non-volatile memory driver v1.3
Jan 26 04:00:41 np0005595445 kernel: rdac: device handler registered
Jan 26 04:00:41 np0005595445 kernel: hp_sw: device handler registered
Jan 26 04:00:41 np0005595445 kernel: emc: device handler registered
Jan 26 04:00:41 np0005595445 kernel: alua: device handler registered
Jan 26 04:00:41 np0005595445 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 04:00:41 np0005595445 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 04:00:41 np0005595445 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 04:00:41 np0005595445 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 04:00:41 np0005595445 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 04:00:41 np0005595445 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 04:00:41 np0005595445 kernel: usb usb1: Product: UHCI Host Controller
Jan 26 04:00:41 np0005595445 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 04:00:41 np0005595445 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 04:00:41 np0005595445 kernel: hub 1-0:1.0: USB hub found
Jan 26 04:00:41 np0005595445 kernel: hub 1-0:1.0: 2 ports detected
Jan 26 04:00:41 np0005595445 kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 04:00:41 np0005595445 kernel: usbserial: USB Serial support registered for generic
Jan 26 04:00:41 np0005595445 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 04:00:41 np0005595445 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 04:00:41 np0005595445 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 04:00:41 np0005595445 kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 04:00:41 np0005595445 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 04:00:41 np0005595445 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 04:00:41 np0005595445 kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 04:00:41 np0005595445 kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T09:00:40 UTC (1769418040)
Jan 26 04:00:41 np0005595445 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 04:00:41 np0005595445 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 04:00:41 np0005595445 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 04:00:41 np0005595445 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 04:00:41 np0005595445 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 04:00:41 np0005595445 kernel: usbcore: registered new interface driver usbhid
Jan 26 04:00:41 np0005595445 kernel: usbhid: USB HID core driver
Jan 26 04:00:41 np0005595445 kernel: drop_monitor: Initializing network drop monitor service
Jan 26 04:00:41 np0005595445 kernel: Initializing XFRM netlink socket
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_INET6 protocol family
Jan 26 04:00:41 np0005595445 kernel: Segment Routing with IPv6
Jan 26 04:00:41 np0005595445 kernel: NET: Registered PF_PACKET protocol family
Jan 26 04:00:41 np0005595445 kernel: mpls_gso: MPLS GSO support
Jan 26 04:00:41 np0005595445 kernel: IPI shorthand broadcast: enabled
Jan 26 04:00:41 np0005595445 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 04:00:41 np0005595445 kernel: AES CTR mode by8 optimization enabled
Jan 26 04:00:41 np0005595445 kernel: sched_clock: Marking stable (1190004076, 156305756)->(1459592791, -113282959)
Jan 26 04:00:41 np0005595445 kernel: registered taskstats version 1
Jan 26 04:00:41 np0005595445 kernel: Loading compiled-in X.509 certificates
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 04:00:41 np0005595445 kernel: Demotion targets for Node 0: null
Jan 26 04:00:41 np0005595445 kernel: page_owner is disabled
Jan 26 04:00:41 np0005595445 kernel: Key type .fscrypt registered
Jan 26 04:00:41 np0005595445 kernel: Key type fscrypt-provisioning registered
Jan 26 04:00:41 np0005595445 kernel: Key type big_key registered
Jan 26 04:00:41 np0005595445 kernel: Key type encrypted registered
Jan 26 04:00:41 np0005595445 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 04:00:41 np0005595445 kernel: Loading compiled-in module X.509 certificates
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 04:00:41 np0005595445 kernel: ima: Allocated hash algorithm: sha256
Jan 26 04:00:41 np0005595445 kernel: ima: No architecture policies found
Jan 26 04:00:41 np0005595445 kernel: evm: Initialising EVM extended attributes:
Jan 26 04:00:41 np0005595445 kernel: evm: security.selinux
Jan 26 04:00:41 np0005595445 kernel: evm: security.SMACK64 (disabled)
Jan 26 04:00:41 np0005595445 kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 04:00:41 np0005595445 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 04:00:41 np0005595445 kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 04:00:41 np0005595445 kernel: evm: security.apparmor (disabled)
Jan 26 04:00:41 np0005595445 kernel: evm: security.ima
Jan 26 04:00:41 np0005595445 kernel: evm: security.capability
Jan 26 04:00:41 np0005595445 kernel: evm: HMAC attrs: 0x1
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 04:00:41 np0005595445 kernel: Running certificate verification RSA selftest
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 04:00:41 np0005595445 kernel: Running certificate verification ECDSA selftest
Jan 26 04:00:41 np0005595445 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 04:00:41 np0005595445 kernel: clk: Disabling unused clocks
Jan 26 04:00:41 np0005595445 kernel: Freeing unused decrypted memory: 2028K
Jan 26 04:00:41 np0005595445 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 04:00:41 np0005595445 kernel: Write protecting the kernel read-only data: 30720k
Jan 26 04:00:41 np0005595445 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: Manufacturer: QEMU
Jan 26 04:00:41 np0005595445 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 04:00:41 np0005595445 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 04:00:41 np0005595445 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 04:00:41 np0005595445 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 04:00:41 np0005595445 kernel: Run /init as init process
Jan 26 04:00:41 np0005595445 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 04:00:41 np0005595445 systemd: Detected virtualization kvm.
Jan 26 04:00:41 np0005595445 systemd: Detected architecture x86-64.
Jan 26 04:00:41 np0005595445 systemd: Running in initrd.
Jan 26 04:00:41 np0005595445 systemd: No hostname configured, using default hostname.
Jan 26 04:00:41 np0005595445 systemd: Hostname set to <localhost>.
Jan 26 04:00:41 np0005595445 systemd: Initializing machine ID from VM UUID.
Jan 26 04:00:41 np0005595445 systemd: Queued start job for default target Initrd Default Target.
Jan 26 04:00:41 np0005595445 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 04:00:41 np0005595445 systemd: Reached target Local Encrypted Volumes.
Jan 26 04:00:41 np0005595445 systemd: Reached target Initrd /usr File System.
Jan 26 04:00:41 np0005595445 systemd: Reached target Local File Systems.
Jan 26 04:00:41 np0005595445 systemd: Reached target Path Units.
Jan 26 04:00:41 np0005595445 systemd: Reached target Slice Units.
Jan 26 04:00:41 np0005595445 systemd: Reached target Swaps.
Jan 26 04:00:41 np0005595445 systemd: Reached target Timer Units.
Jan 26 04:00:41 np0005595445 systemd: Listening on D-Bus System Message Bus Socket.
Jan 26 04:00:41 np0005595445 systemd: Listening on Journal Socket (/dev/log).
Jan 26 04:00:41 np0005595445 systemd: Listening on Journal Socket.
Jan 26 04:00:41 np0005595445 systemd: Listening on udev Control Socket.
Jan 26 04:00:41 np0005595445 systemd: Listening on udev Kernel Socket.
Jan 26 04:00:41 np0005595445 systemd: Reached target Socket Units.
Jan 26 04:00:41 np0005595445 systemd: Starting Create List of Static Device Nodes...
Jan 26 04:00:41 np0005595445 systemd: Starting Journal Service...
Jan 26 04:00:41 np0005595445 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 04:00:41 np0005595445 systemd: Starting Apply Kernel Variables...
Jan 26 04:00:41 np0005595445 systemd: Starting Create System Users...
Jan 26 04:00:41 np0005595445 systemd: Starting Setup Virtual Console...
Jan 26 04:00:41 np0005595445 systemd: Finished Create List of Static Device Nodes.
Jan 26 04:00:41 np0005595445 systemd: Finished Apply Kernel Variables.
Jan 26 04:00:41 np0005595445 systemd: Finished Create System Users.
Jan 26 04:00:41 np0005595445 systemd-journald[303]: Journal started
Jan 26 04:00:41 np0005595445 systemd-journald[303]: Runtime Journal (/run/log/journal/0657a708098a4137a4d88ea25323424c) is 8.0M, max 153.6M, 145.6M free.
Jan 26 04:00:41 np0005595445 systemd-sysusers[308]: Creating group 'users' with GID 100.
Jan 26 04:00:41 np0005595445 systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Jan 26 04:00:41 np0005595445 systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 04:00:41 np0005595445 systemd: Started Journal Service.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 04:00:41 np0005595445 systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 04:00:41 np0005595445 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 04:00:41 np0005595445 systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 04:00:41 np0005595445 systemd[1]: Finished Setup Virtual Console.
Jan 26 04:00:41 np0005595445 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting dracut cmdline hook...
Jan 26 04:00:41 np0005595445 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 04:00:41 np0005595445 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 04:00:41 np0005595445 systemd[1]: Finished dracut cmdline hook.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting dracut pre-udev hook...
Jan 26 04:00:41 np0005595445 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 04:00:41 np0005595445 kernel: device-mapper: uevent: version 1.0.3
Jan 26 04:00:41 np0005595445 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 04:00:41 np0005595445 kernel: RPC: Registered named UNIX socket transport module.
Jan 26 04:00:41 np0005595445 kernel: RPC: Registered udp transport module.
Jan 26 04:00:41 np0005595445 kernel: RPC: Registered tcp transport module.
Jan 26 04:00:41 np0005595445 kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 04:00:41 np0005595445 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 04:00:41 np0005595445 rpc.statd[443]: Version 2.5.4 starting
Jan 26 04:00:41 np0005595445 rpc.statd[443]: Initializing NSM state
Jan 26 04:00:41 np0005595445 rpc.idmapd[448]: Setting log level to 0
Jan 26 04:00:41 np0005595445 systemd[1]: Finished dracut pre-udev hook.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 04:00:41 np0005595445 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 04:00:41 np0005595445 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting dracut pre-trigger hook...
Jan 26 04:00:41 np0005595445 systemd[1]: Finished dracut pre-trigger hook.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting Coldplug All udev Devices...
Jan 26 04:00:41 np0005595445 systemd[1]: Created slice Slice /system/modprobe.
Jan 26 04:00:41 np0005595445 systemd[1]: Starting Load Kernel Module configfs...
Jan 26 04:00:41 np0005595445 systemd[1]: Finished Coldplug All udev Devices.
Jan 26 04:00:41 np0005595445 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 04:00:41 np0005595445 systemd[1]: Finished Load Kernel Module configfs.
Jan 26 04:00:41 np0005595445 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 04:00:41 np0005595445 systemd[1]: Reached target Network.
Jan 26 04:00:41 np0005595445 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 04:00:41 np0005595445 systemd[1]: Starting dracut initqueue hook...
Jan 26 04:00:42 np0005595445 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 04:00:42 np0005595445 kernel: scsi host0: ata_piix
Jan 26 04:00:42 np0005595445 kernel: scsi host1: ata_piix
Jan 26 04:00:42 np0005595445 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 04:00:42 np0005595445 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 04:00:42 np0005595445 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 26 04:00:42 np0005595445 kernel: vda: vda1
Jan 26 04:00:42 np0005595445 systemd[1]: Mounting Kernel Configuration File System...
Jan 26 04:00:42 np0005595445 systemd[1]: Mounted Kernel Configuration File System.
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target System Initialization.
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target Basic System.
Jan 26 04:00:42 np0005595445 kernel: ata1: found unknown device (class 0)
Jan 26 04:00:42 np0005595445 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 04:00:42 np0005595445 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 04:00:42 np0005595445 systemd-udevd[464]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:00:42 np0005595445 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 04:00:42 np0005595445 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target Initrd Root Device.
Jan 26 04:00:42 np0005595445 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 04:00:42 np0005595445 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 04:00:42 np0005595445 systemd[1]: Finished dracut initqueue hook.
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 04:00:42 np0005595445 systemd[1]: Reached target Remote File Systems.
Jan 26 04:00:42 np0005595445 systemd[1]: Starting dracut pre-mount hook...
Jan 26 04:00:42 np0005595445 systemd[1]: Finished dracut pre-mount hook.
Jan 26 04:00:42 np0005595445 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 04:00:42 np0005595445 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 04:00:42 np0005595445 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 04:00:42 np0005595445 systemd[1]: Mounting /sysroot...
Jan 26 04:00:42 np0005595445 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 04:00:42 np0005595445 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 04:00:42 np0005595445 kernel: XFS (vda1): Ending clean mount
Jan 26 04:00:43 np0005595445 systemd[1]: Mounted /sysroot.
Jan 26 04:00:43 np0005595445 systemd[1]: Reached target Initrd Root File System.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 04:00:43 np0005595445 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 04:00:43 np0005595445 systemd[1]: Reached target Initrd File Systems.
Jan 26 04:00:43 np0005595445 systemd[1]: Reached target Initrd Default Target.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting dracut mount hook...
Jan 26 04:00:43 np0005595445 systemd[1]: Finished dracut mount hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 04:00:43 np0005595445 rpc.idmapd[448]: exiting on signal 15
Jan 26 04:00:43 np0005595445 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Network.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Timer Units.
Jan 26 04:00:43 np0005595445 systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Initrd Default Target.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Basic System.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Initrd Root Device.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Initrd /usr File System.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Path Units.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Remote File Systems.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Slice Units.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Socket Units.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target System Initialization.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Local File Systems.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Swaps.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut mount hook.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut pre-mount hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut initqueue hook.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Setup Virtual Console.
Jan 26 04:00:43 np0005595445 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Closed udev Control Socket.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Closed udev Kernel Socket.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut pre-udev hook.
Jan 26 04:00:43 np0005595445 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped dracut cmdline hook.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting Cleanup udev Database...
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 04:00:43 np0005595445 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 04:00:43 np0005595445 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Stopped Create System Users.
Jan 26 04:00:43 np0005595445 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 04:00:43 np0005595445 systemd[1]: Finished Cleanup udev Database.
Jan 26 04:00:43 np0005595445 systemd[1]: Reached target Switch Root.
Jan 26 04:00:43 np0005595445 systemd[1]: Starting Switch Root...
Jan 26 04:00:43 np0005595445 systemd[1]: Switching root.
Jan 26 04:00:43 np0005595445 systemd-journald[303]: Journal stopped
Jan 26 04:00:44 np0005595445 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 26 04:00:44 np0005595445 kernel: audit: type=1404 audit(1769418043.412:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:00:44 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:00:44 np0005595445 kernel: audit: type=1403 audit(1769418043.554:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 04:00:44 np0005595445 systemd: Successfully loaded SELinux policy in 145.312ms.
Jan 26 04:00:44 np0005595445 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.394ms.
Jan 26 04:00:44 np0005595445 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 04:00:44 np0005595445 systemd: Detected virtualization kvm.
Jan 26 04:00:44 np0005595445 systemd: Detected architecture x86-64.
Jan 26 04:00:44 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:00:44 np0005595445 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd: Stopped Switch Root.
Jan 26 04:00:44 np0005595445 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 04:00:44 np0005595445 systemd: Created slice Slice /system/getty.
Jan 26 04:00:44 np0005595445 systemd: Created slice Slice /system/serial-getty.
Jan 26 04:00:44 np0005595445 systemd: Created slice Slice /system/sshd-keygen.
Jan 26 04:00:44 np0005595445 systemd: Created slice User and Session Slice.
Jan 26 04:00:44 np0005595445 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 04:00:44 np0005595445 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 26 04:00:44 np0005595445 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 04:00:44 np0005595445 systemd: Reached target Local Encrypted Volumes.
Jan 26 04:00:44 np0005595445 systemd: Stopped target Switch Root.
Jan 26 04:00:44 np0005595445 systemd: Stopped target Initrd File Systems.
Jan 26 04:00:44 np0005595445 systemd: Stopped target Initrd Root File System.
Jan 26 04:00:44 np0005595445 systemd: Reached target Local Integrity Protected Volumes.
Jan 26 04:00:44 np0005595445 systemd: Reached target Path Units.
Jan 26 04:00:44 np0005595445 systemd: Reached target rpc_pipefs.target.
Jan 26 04:00:44 np0005595445 systemd: Reached target Slice Units.
Jan 26 04:00:44 np0005595445 systemd: Reached target Swaps.
Jan 26 04:00:44 np0005595445 systemd: Reached target Local Verity Protected Volumes.
Jan 26 04:00:44 np0005595445 systemd: Listening on RPCbind Server Activation Socket.
Jan 26 04:00:44 np0005595445 systemd: Reached target RPC Port Mapper.
Jan 26 04:00:44 np0005595445 systemd: Listening on Process Core Dump Socket.
Jan 26 04:00:44 np0005595445 systemd: Listening on initctl Compatibility Named Pipe.
Jan 26 04:00:44 np0005595445 systemd: Listening on udev Control Socket.
Jan 26 04:00:44 np0005595445 systemd: Listening on udev Kernel Socket.
Jan 26 04:00:44 np0005595445 systemd: Mounting Huge Pages File System...
Jan 26 04:00:44 np0005595445 systemd: Mounting POSIX Message Queue File System...
Jan 26 04:00:44 np0005595445 systemd: Mounting Kernel Debug File System...
Jan 26 04:00:44 np0005595445 systemd: Mounting Kernel Trace File System...
Jan 26 04:00:44 np0005595445 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 04:00:44 np0005595445 systemd: Starting Create List of Static Device Nodes...
Jan 26 04:00:44 np0005595445 systemd: Starting Load Kernel Module configfs...
Jan 26 04:00:44 np0005595445 systemd: Starting Load Kernel Module drm...
Jan 26 04:00:44 np0005595445 systemd: Starting Load Kernel Module efi_pstore...
Jan 26 04:00:44 np0005595445 systemd: Starting Load Kernel Module fuse...
Jan 26 04:00:44 np0005595445 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 04:00:44 np0005595445 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd: Stopped File System Check on Root Device.
Jan 26 04:00:44 np0005595445 systemd: Stopped Journal Service.
Jan 26 04:00:44 np0005595445 systemd: Starting Journal Service...
Jan 26 04:00:44 np0005595445 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 04:00:44 np0005595445 systemd: Starting Generate network units from Kernel command line...
Jan 26 04:00:44 np0005595445 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 04:00:44 np0005595445 systemd: Starting Remount Root and Kernel File Systems...
Jan 26 04:00:44 np0005595445 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 04:00:44 np0005595445 systemd: Starting Apply Kernel Variables...
Jan 26 04:00:44 np0005595445 kernel: fuse: init (API version 7.37)
Jan 26 04:00:44 np0005595445 systemd: Starting Coldplug All udev Devices...
Jan 26 04:00:44 np0005595445 systemd-journald[677]: Journal started
Jan 26 04:00:44 np0005595445 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 04:00:44 np0005595445 systemd[1]: Queued start job for default target Multi-User System.
Jan 26 04:00:44 np0005595445 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd: Started Journal Service.
Jan 26 04:00:44 np0005595445 systemd[1]: Mounted Huge Pages File System.
Jan 26 04:00:44 np0005595445 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 04:00:44 np0005595445 systemd[1]: Mounted POSIX Message Queue File System.
Jan 26 04:00:44 np0005595445 systemd[1]: Mounted Kernel Debug File System.
Jan 26 04:00:44 np0005595445 systemd[1]: Mounted Kernel Trace File System.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 04:00:44 np0005595445 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Load Kernel Module configfs.
Jan 26 04:00:44 np0005595445 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 26 04:00:44 np0005595445 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Load Kernel Module fuse.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Apply Kernel Variables.
Jan 26 04:00:44 np0005595445 systemd[1]: Mounting FUSE Control File System...
Jan 26 04:00:44 np0005595445 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Rebuild Hardware Database...
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 04:00:44 np0005595445 kernel: ACPI: bus type drm_connector registered
Jan 26 04:00:44 np0005595445 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 04:00:44 np0005595445 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 04:00:44 np0005595445 systemd-journald[677]: Received client request to flush runtime journal.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Create System Users...
Jan 26 04:00:44 np0005595445 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Load Kernel Module drm.
Jan 26 04:00:44 np0005595445 systemd[1]: Mounted FUSE Control File System.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 04:00:44 np0005595445 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Create System Users.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Coldplug All udev Devices.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 04:00:44 np0005595445 systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 04:00:44 np0005595445 systemd[1]: Reached target Local File Systems.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 04:00:44 np0005595445 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 04:00:44 np0005595445 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 04:00:44 np0005595445 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 04:00:44 np0005595445 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 04:00:44 np0005595445 bootctl[694]: Couldn't find EFI system partition, skipping.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Security Auditing Service...
Jan 26 04:00:44 np0005595445 systemd[1]: Starting RPC Bind...
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 04:00:44 np0005595445 auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 04:00:44 np0005595445 auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 04:00:44 np0005595445 systemd[1]: Started RPC Bind.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 04:00:44 np0005595445 augenrules[705]: /sbin/augenrules: No change
Jan 26 04:00:44 np0005595445 augenrules[720]: No rules
Jan 26 04:00:44 np0005595445 augenrules[720]: enabled 1
Jan 26 04:00:44 np0005595445 augenrules[720]: failure 1
Jan 26 04:00:44 np0005595445 augenrules[720]: pid 700
Jan 26 04:00:44 np0005595445 augenrules[720]: rate_limit 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_limit 8192
Jan 26 04:00:44 np0005595445 augenrules[720]: lost 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog 3
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time 60000
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time_actual 0
Jan 26 04:00:44 np0005595445 augenrules[720]: enabled 1
Jan 26 04:00:44 np0005595445 augenrules[720]: failure 1
Jan 26 04:00:44 np0005595445 augenrules[720]: pid 700
Jan 26 04:00:44 np0005595445 augenrules[720]: rate_limit 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_limit 8192
Jan 26 04:00:44 np0005595445 augenrules[720]: lost 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time 60000
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time_actual 0
Jan 26 04:00:44 np0005595445 augenrules[720]: enabled 1
Jan 26 04:00:44 np0005595445 augenrules[720]: failure 1
Jan 26 04:00:44 np0005595445 augenrules[720]: pid 700
Jan 26 04:00:44 np0005595445 augenrules[720]: rate_limit 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_limit 8192
Jan 26 04:00:44 np0005595445 augenrules[720]: lost 0
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog 3
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time 60000
Jan 26 04:00:44 np0005595445 augenrules[720]: backlog_wait_time_actual 0
Jan 26 04:00:44 np0005595445 systemd[1]: Started Security Auditing Service.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Rebuild Hardware Database.
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 04:00:44 np0005595445 systemd[1]: Starting Update is Completed...
Jan 26 04:00:44 np0005595445 systemd[1]: Finished Update is Completed.
Jan 26 04:00:45 np0005595445 systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 04:00:45 np0005595445 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target System Initialization.
Jan 26 04:00:45 np0005595445 systemd[1]: Started dnf makecache --timer.
Jan 26 04:00:45 np0005595445 systemd[1]: Started Daily rotation of log files.
Jan 26 04:00:45 np0005595445 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target Timer Units.
Jan 26 04:00:45 np0005595445 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 04:00:45 np0005595445 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target Socket Units.
Jan 26 04:00:45 np0005595445 systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:00:45 np0005595445 systemd[1]: Starting D-Bus System Message Bus...
Jan 26 04:00:45 np0005595445 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 04:00:45 np0005595445 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 04:00:45 np0005595445 systemd[1]: Starting Load Kernel Module configfs...
Jan 26 04:00:45 np0005595445 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 04:00:45 np0005595445 systemd[1]: Finished Load Kernel Module configfs.
Jan 26 04:00:45 np0005595445 systemd[1]: Started D-Bus System Message Bus.
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target Basic System.
Jan 26 04:00:45 np0005595445 dbus-broker-lau[764]: Ready
Jan 26 04:00:45 np0005595445 systemd[1]: Starting NTP client/server...
Jan 26 04:00:45 np0005595445 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 04:00:45 np0005595445 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 04:00:45 np0005595445 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 04:00:45 np0005595445 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 04:00:45 np0005595445 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 04:00:45 np0005595445 systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 04:00:45 np0005595445 systemd[1]: Started irqbalance daemon.
Jan 26 04:00:45 np0005595445 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 04:00:45 np0005595445 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:00:45 np0005595445 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:00:45 np0005595445 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target sshd-keygen.target.
Jan 26 04:00:45 np0005595445 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 04:00:45 np0005595445 systemd[1]: Reached target User and Group Name Lookups.
Jan 26 04:00:45 np0005595445 systemd[1]: Starting User Login Management...
Jan 26 04:00:45 np0005595445 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 04:00:45 np0005595445 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 04:00:45 np0005595445 chronyd[791]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 04:00:45 np0005595445 chronyd[791]: Loaded 0 symmetric keys
Jan 26 04:00:45 np0005595445 chronyd[791]: Using right/UTC timezone to obtain leap second data
Jan 26 04:00:45 np0005595445 chronyd[791]: Loaded seccomp filter (level 2)
Jan 26 04:00:45 np0005595445 systemd[1]: Started NTP client/server.
Jan 26 04:00:45 np0005595445 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 04:00:45 np0005595445 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 04:00:45 np0005595445 systemd-logind[783]: New seat seat0.
Jan 26 04:00:45 np0005595445 systemd[1]: Started User Login Management.
Jan 26 04:00:45 np0005595445 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 04:00:45 np0005595445 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 04:00:45 np0005595445 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 04:00:45 np0005595445 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 04:00:45 np0005595445 kernel: Console: switching to colour dummy device 80x25
Jan 26 04:00:45 np0005595445 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 04:00:45 np0005595445 kernel: [drm] features: -context_init
Jan 26 04:00:45 np0005595445 kernel: [drm] number of scanouts: 1
Jan 26 04:00:45 np0005595445 kernel: [drm] number of cap sets: 0
Jan 26 04:00:45 np0005595445 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 04:00:45 np0005595445 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 04:00:45 np0005595445 kernel: Console: switching to colour frame buffer device 128x48
Jan 26 04:00:45 np0005595445 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 04:00:45 np0005595445 kernel: kvm_amd: TSC scaling supported
Jan 26 04:00:45 np0005595445 kernel: kvm_amd: Nested Virtualization enabled
Jan 26 04:00:45 np0005595445 kernel: kvm_amd: Nested Paging enabled
Jan 26 04:00:45 np0005595445 kernel: kvm_amd: LBR virtualization supported
Jan 26 04:00:45 np0005595445 iptables.init[776]: iptables: Applying firewall rules: [  OK  ]
Jan 26 04:00:45 np0005595445 systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 04:00:45 np0005595445 cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 09:00:45 +0000. Up 6.25 seconds.
Jan 26 04:00:45 np0005595445 systemd[1]: run-cloud\x2dinit-tmp-tmp75cc9w6z.mount: Deactivated successfully.
Jan 26 04:00:45 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 04:00:45 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 04:00:45 np0005595445 systemd-hostnamed[852]: Hostname set to <np0005595445.novalocal> (static)
Jan 26 04:00:46 np0005595445 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 04:00:46 np0005595445 systemd[1]: Reached target Preparation for Network.
Jan 26 04:00:46 np0005595445 systemd[1]: Starting Network Manager...
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1345] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1350] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1413] manager[0x5638e96c4000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1450] hostname: hostname: using hostnamed
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1451] hostname: static hostname changed from (none) to "np0005595445.novalocal"
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1456] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1564] manager[0x5638e96c4000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1565] manager[0x5638e96c4000]: rfkill: WWAN hardware radio set enabled
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1602] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1602] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1602] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1603] manager: Networking is enabled by state file
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1605] settings: Loaded settings plugin: keyfile (internal)
Jan 26 04:00:46 np0005595445 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1613] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1630] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1639] dhcp: init: Using DHCP client 'internal'
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1641] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1652] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1658] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1664] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1672] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1673] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1694] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1698] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1699] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1700] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1702] device (eth0): carrier: link connected
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1704] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1708] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1712] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1716] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1716] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1718] manager: NetworkManager state is now CONNECTING
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1719] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1723] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1726] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:00:46 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:00:46 np0005595445 systemd[1]: Started Network Manager.
Jan 26 04:00:46 np0005595445 systemd[1]: Reached target Network.
Jan 26 04:00:46 np0005595445 systemd[1]: Starting Network Manager Wait Online...
Jan 26 04:00:46 np0005595445 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 04:00:46 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1943] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1946] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 04:00:46 np0005595445 NetworkManager[856]: <info>  [1769418046.1951] device (lo): Activation: successful, device activated.
Jan 26 04:00:46 np0005595445 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 04:00:46 np0005595445 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 04:00:46 np0005595445 systemd[1]: Reached target NFS client services.
Jan 26 04:00:46 np0005595445 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 04:00:46 np0005595445 systemd[1]: Reached target Remote File Systems.
Jan 26 04:00:46 np0005595445 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7788] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7799] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7824] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7860] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7861] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7864] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7866] device (eth0): Activation: successful, device activated.
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7871] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 04:00:49 np0005595445 NetworkManager[856]: <info>  [1769418049.7872] manager: startup complete
Jan 26 04:00:49 np0005595445 systemd[1]: Finished Network Manager Wait Online.
Jan 26 04:00:49 np0005595445 systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 04:00:50 np0005595445 cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 09:00:50 +0000. Up 10.70 seconds.
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.217        | 255.255.255.0 | global | fa:16:3e:37:06:29 |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe37:629/64 |       .       |  link  | fa:16:3e:37:06:29 |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 04:00:50 np0005595445 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 04:00:51 np0005595445 cloud-init[920]: Generating public/private rsa key pair.
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key fingerprint is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: SHA256:91aVOgqkVUYNufwoffI9fg3P2WuF7eXKoMDF99BQgsU root@np0005595445.novalocal
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key's randomart image is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: +---[RSA 3072]----+
Jan 26 04:00:51 np0005595445 cloud-init[920]: |          .O* .  |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |          +.E+  .|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |         o. o  ..|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |        +. o o.. |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |       .S.= =ooo |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |       . +.*.*+ +|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |        o ..B o**|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |         . o + =O|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |          .   =++|
Jan 26 04:00:51 np0005595445 cloud-init[920]: +----[SHA256]-----+
Jan 26 04:00:51 np0005595445 cloud-init[920]: Generating public/private ecdsa key pair.
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key fingerprint is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: SHA256:nJVILKfidZN5/mS6tozdBjlcS6E3CzkyDWlAcgjZf+8 root@np0005595445.novalocal
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key's randomart image is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: +---[ECDSA 256]---+
Jan 26 04:00:51 np0005595445 cloud-init[920]: |  .+.o+o..       |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |  . oo..*. ..    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |     . =.=oo .   |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |    . +.OoB =    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |   . o oSO B +   |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |    .     B =    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |         . B     |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |         +E.o    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |        ..=+.    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: +----[SHA256]-----+
Jan 26 04:00:51 np0005595445 cloud-init[920]: Generating public/private ed25519 key pair.
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 04:00:51 np0005595445 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key fingerprint is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: SHA256:dQAzxXqKTk7xH9NGYVPEVgUNRMDrhIkCFBUqOSwhkz4 root@np0005595445.novalocal
Jan 26 04:00:51 np0005595445 cloud-init[920]: The key's randomart image is:
Jan 26 04:00:51 np0005595445 cloud-init[920]: +--[ED25519 256]--+
Jan 26 04:00:51 np0005595445 cloud-init[920]: |+. .ooo.++o..BB=+|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |o+ ...   o..= o .|
Jan 26 04:00:51 np0005595445 cloud-init[920]: |o = ..   o.+.=   |
Jan 26 04:00:51 np0005595445 cloud-init[920]: | E o  o o.+.+    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |  .    =So =     |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |      + o o +    |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |     =   . +     |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |      o   .      |
Jan 26 04:00:51 np0005595445 cloud-init[920]: |                 |
Jan 26 04:00:51 np0005595445 cloud-init[920]: +----[SHA256]-----+
Jan 26 04:00:51 np0005595445 systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 04:00:51 np0005595445 systemd[1]: Reached target Cloud-config availability.
Jan 26 04:00:51 np0005595445 systemd[1]: Reached target Network is Online.
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Crash recovery kernel arming...
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 04:00:51 np0005595445 systemd[1]: Starting System Logging Service...
Jan 26 04:00:51 np0005595445 sm-notify[1004]: Version 2.5.4 starting
Jan 26 04:00:51 np0005595445 systemd[1]: Starting OpenSSH server daemon...
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Permit User Sessions...
Jan 26 04:00:51 np0005595445 systemd[1]: Started Notify NFS peers of a restart.
Jan 26 04:00:51 np0005595445 systemd[1]: Finished Permit User Sessions.
Jan 26 04:00:51 np0005595445 systemd[1]: Started Command Scheduler.
Jan 26 04:00:51 np0005595445 systemd[1]: Started Getty on tty1.
Jan 26 04:00:51 np0005595445 systemd[1]: Started Serial Getty on ttyS0.
Jan 26 04:00:51 np0005595445 systemd[1]: Reached target Login Prompts.
Jan 26 04:00:51 np0005595445 systemd[1]: Started OpenSSH server daemon.
Jan 26 04:00:51 np0005595445 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Jan 26 04:00:51 np0005595445 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 04:00:51 np0005595445 systemd[1]: Started System Logging Service.
Jan 26 04:00:51 np0005595445 systemd[1]: Reached target Multi-User System.
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 04:00:51 np0005595445 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 04:00:51 np0005595445 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 04:00:51 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:00:51 np0005595445 kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Jan 26 04:00:51 np0005595445 kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 04:00:51 np0005595445 cloud-init[1132]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 09:00:51 +0000. Up 12.45 seconds.
Jan 26 04:00:51 np0005595445 systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 04:00:51 np0005595445 systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 04:00:52 np0005595445 dracut[1265]: dracut-057-102.git20250818.el9
Jan 26 04:00:52 np0005595445 cloud-init[1283]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 09:00:52 +0000. Up 12.84 seconds.
Jan 26 04:00:52 np0005595445 dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 04:00:52 np0005595445 cloud-init[1298]: #############################################################
Jan 26 04:00:52 np0005595445 cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 04:00:52 np0005595445 cloud-init[1307]: 256 SHA256:nJVILKfidZN5/mS6tozdBjlcS6E3CzkyDWlAcgjZf+8 root@np0005595445.novalocal (ECDSA)
Jan 26 04:00:52 np0005595445 cloud-init[1312]: 256 SHA256:dQAzxXqKTk7xH9NGYVPEVgUNRMDrhIkCFBUqOSwhkz4 root@np0005595445.novalocal (ED25519)
Jan 26 04:00:52 np0005595445 cloud-init[1319]: 3072 SHA256:91aVOgqkVUYNufwoffI9fg3P2WuF7eXKoMDF99BQgsU root@np0005595445.novalocal (RSA)
Jan 26 04:00:52 np0005595445 cloud-init[1320]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 04:00:52 np0005595445 cloud-init[1321]: #############################################################
Jan 26 04:00:52 np0005595445 cloud-init[1283]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 09:00:52 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.02 seconds
Jan 26 04:00:52 np0005595445 systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 04:00:52 np0005595445 systemd[1]: Reached target Cloud-init target.
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 04:00:52 np0005595445 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: memstrack is not available
Jan 26 04:00:53 np0005595445 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 04:00:53 np0005595445 dracut[1267]: memstrack is not available
Jan 26 04:00:53 np0005595445 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 04:00:53 np0005595445 dracut[1267]: *** Including module: systemd ***
Jan 26 04:00:53 np0005595445 dracut[1267]: *** Including module: fips ***
Jan 26 04:00:54 np0005595445 dracut[1267]: *** Including module: systemd-initrd ***
Jan 26 04:00:54 np0005595445 dracut[1267]: *** Including module: i18n ***
Jan 26 04:00:54 np0005595445 dracut[1267]: *** Including module: drm ***
Jan 26 04:00:54 np0005595445 dracut[1267]: *** Including module: prefixdevname ***
Jan 26 04:00:54 np0005595445 dracut[1267]: *** Including module: kernel-modules ***
Jan 26 04:00:54 np0005595445 kernel: block vda: the capability attribute has been deprecated.
Jan 26 04:00:54 np0005595445 chronyd[791]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Jan 26 04:00:54 np0005595445 chronyd[791]: System clock TAI offset set to 37 seconds
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: kernel-modules-extra ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: qemu ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: fstab-sys ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: rootfs-block ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: terminfo ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: udev-rules ***
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 35 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 33 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 31 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 28 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 34 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 32 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 30 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 irqbalance[778]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 04:00:55 np0005595445 irqbalance[778]: IRQ 29 affinity is now unmanaged
Jan 26 04:00:55 np0005595445 dracut[1267]: Skipping udev rule: 91-permissions.rules
Jan 26 04:00:55 np0005595445 dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: virtiofs ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: dracut-systemd ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: usrmount ***
Jan 26 04:00:55 np0005595445 dracut[1267]: *** Including module: base ***
Jan 26 04:00:56 np0005595445 dracut[1267]: *** Including module: fs-lib ***
Jan 26 04:00:56 np0005595445 dracut[1267]: *** Including module: kdumpbase ***
Jan 26 04:00:56 np0005595445 dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 04:00:56 np0005595445 dracut[1267]:  microcode_ctl module: mangling fw_dir
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 04:00:56 np0005595445 dracut[1267]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 04:00:56 np0005595445 dracut[1267]: *** Including module: openssl ***
Jan 26 04:00:56 np0005595445 dracut[1267]: *** Including module: shutdown ***
Jan 26 04:00:57 np0005595445 dracut[1267]: *** Including module: squash ***
Jan 26 04:00:57 np0005595445 dracut[1267]: *** Including modules done ***
Jan 26 04:00:57 np0005595445 dracut[1267]: *** Installing kernel module dependencies ***
Jan 26 04:00:57 np0005595445 dracut[1267]: *** Installing kernel module dependencies done ***
Jan 26 04:00:57 np0005595445 dracut[1267]: *** Resolving executable dependencies ***
Jan 26 04:00:59 np0005595445 dracut[1267]: *** Resolving executable dependencies done ***
Jan 26 04:00:59 np0005595445 dracut[1267]: *** Generating early-microcode cpio image ***
Jan 26 04:00:59 np0005595445 dracut[1267]: *** Store current command line parameters ***
Jan 26 04:00:59 np0005595445 dracut[1267]: Stored kernel commandline:
Jan 26 04:00:59 np0005595445 dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Jan 26 04:00:59 np0005595445 dracut[1267]: *** Install squash loader ***
Jan 26 04:00:59 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:01:00 np0005595445 dracut[1267]: *** Squashing the files inside the initramfs ***
Jan 26 04:01:01 np0005595445 dracut[1267]: *** Squashing the files inside the initramfs done ***
Jan 26 04:01:01 np0005595445 dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 04:01:01 np0005595445 dracut[1267]: *** Hardlinking files ***
Jan 26 04:01:01 np0005595445 dracut[1267]: *** Hardlinking files done ***
Jan 26 04:01:01 np0005595445 dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 04:01:02 np0005595445 kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Jan 26 04:01:02 np0005595445 kdumpctl[1014]: kdump: Starting kdump: [OK]
Jan 26 04:01:02 np0005595445 systemd[1]: Finished Crash recovery kernel arming.
Jan 26 04:01:02 np0005595445 systemd[1]: Startup finished in 1.551s (kernel) + 2.488s (initrd) + 18.743s (userspace) = 22.783s.
Jan 26 04:01:12 np0005595445 systemd[1]: Created slice User Slice of UID 1000.
Jan 26 04:01:12 np0005595445 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 04:01:12 np0005595445 systemd-logind[783]: New session 1 of user zuul.
Jan 26 04:01:12 np0005595445 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 04:01:12 np0005595445 systemd[1]: Starting User Manager for UID 1000...
Jan 26 04:01:12 np0005595445 systemd[4320]: Queued start job for default target Main User Target.
Jan 26 04:01:12 np0005595445 systemd[4320]: Created slice User Application Slice.
Jan 26 04:01:12 np0005595445 systemd[4320]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 04:01:12 np0005595445 systemd[4320]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 04:01:12 np0005595445 systemd[4320]: Reached target Paths.
Jan 26 04:01:12 np0005595445 systemd[4320]: Reached target Timers.
Jan 26 04:01:12 np0005595445 systemd[4320]: Starting D-Bus User Message Bus Socket...
Jan 26 04:01:12 np0005595445 systemd[4320]: Starting Create User's Volatile Files and Directories...
Jan 26 04:01:12 np0005595445 systemd[4320]: Finished Create User's Volatile Files and Directories.
Jan 26 04:01:12 np0005595445 systemd[4320]: Listening on D-Bus User Message Bus Socket.
Jan 26 04:01:12 np0005595445 systemd[4320]: Reached target Sockets.
Jan 26 04:01:12 np0005595445 systemd[4320]: Reached target Basic System.
Jan 26 04:01:12 np0005595445 systemd[4320]: Reached target Main User Target.
Jan 26 04:01:12 np0005595445 systemd[4320]: Startup finished in 97ms.
Jan 26 04:01:12 np0005595445 systemd[1]: Started User Manager for UID 1000.
Jan 26 04:01:12 np0005595445 systemd[1]: Started Session 1 of User zuul.
Jan 26 04:01:13 np0005595445 python3[4402]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:01:16 np0005595445 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 04:01:16 np0005595445 python3[4432]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:01:23 np0005595445 python3[4490]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:01:24 np0005595445 python3[4530]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 04:01:26 np0005595445 python3[4556]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDcdeFK+2Uzt/crRIxrw7Ii0Wo86Wha7SQ4BMdscA2exHPGkBWYBIRcQLWh4xXNIJqC/AbzacMgTYrRAOvShuuLUpKyUrfzg1ixfRmJf9fdw2BnSl3RjaKwYMifr2EHSvqhf5bD53uBkC+IdHfTnkuZk6EY16XIhr9eCxuKNHAwKJpnEOyw1gCntHfxFz0wBfy4kv0fT3TjsjCqzDNTzpWx8b5EO9vxnMmoYiZfDcbf2IFeK5LN6O1oAinJsvJV4PpR7ajuvFx5ScMj/FmW42D4VqeCnnHNS5dWt8JHxwY3glRh2xbY1AFfOTDQ7mJSgDV1rY+vTDOxZH3NcovSw7e0hh1Qt3oRYf47AAcmQdH72ljw6N0w34lxQMgBXA4gr6gzREYttTLX3EzRinYa6SypE2Grj5mT9zmv/OvQcULUWVTP443n0NBQIl+NzQqTOwT0s5E1arsVCcgSGTH/tsVlIFM7jffJDMuZorpoMWq/apou6G84JCh7dpggM1MDTCM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:26 np0005595445 python3[4580]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:27 np0005595445 python3[4679]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:27 np0005595445 python3[4750]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418087.1324046-252-45805790609404/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=ce1a2ab08f8a45f8bb0154795a55a641_id_rsa follow=False checksum=50f004a61600a842dd3e22b3105eef0d4eef20ff backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:28 np0005595445 python3[4873]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:28 np0005595445 python3[4944]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418088.1256952-307-195949309720022/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=ce1a2ab08f8a45f8bb0154795a55a641_id_rsa.pub follow=False checksum=c08207c95d113b3d2dc53dab777685d45917b3fb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:30 np0005595445 python3[4992]: ansible-ping Invoked with data=pong
Jan 26 04:01:31 np0005595445 python3[5016]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:01:33 np0005595445 python3[5074]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 04:01:34 np0005595445 python3[5106]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:34 np0005595445 python3[5130]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:34 np0005595445 python3[5154]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:35 np0005595445 python3[5178]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:35 np0005595445 python3[5202]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:35 np0005595445 python3[5226]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:37 np0005595445 python3[5252]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:38 np0005595445 python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:38 np0005595445 python3[5403]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418097.9074206-32-165934380594243/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:39 np0005595445 python3[5451]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:39 np0005595445 python3[5475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:40 np0005595445 python3[5499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:40 np0005595445 python3[5523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:40 np0005595445 python3[5547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:40 np0005595445 python3[5571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:41 np0005595445 python3[5595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:41 np0005595445 python3[5619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:41 np0005595445 python3[5643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:41 np0005595445 python3[5667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:42 np0005595445 python3[5691]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:42 np0005595445 python3[5715]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:42 np0005595445 python3[5739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:42 np0005595445 python3[5763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:43 np0005595445 python3[5787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:43 np0005595445 python3[5811]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:43 np0005595445 python3[5835]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:43 np0005595445 python3[5859]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:44 np0005595445 python3[5883]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:44 np0005595445 python3[5907]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:44 np0005595445 python3[5931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:45 np0005595445 python3[5955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:45 np0005595445 python3[5979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:45 np0005595445 python3[6003]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:45 np0005595445 python3[6027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:46 np0005595445 python3[6051]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:01:49 np0005595445 python3[6077]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 04:01:49 np0005595445 systemd[1]: Starting Time & Date Service...
Jan 26 04:01:49 np0005595445 systemd[1]: Started Time & Date Service.
Jan 26 04:01:49 np0005595445 systemd-timedated[6079]: Changed time zone to 'UTC' (UTC).
Jan 26 04:01:50 np0005595445 python3[6108]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:50 np0005595445 python3[6184]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:50 np0005595445 python3[6255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769418110.2552183-252-55896218177457/source _original_basename=tmp5wnjhfyl follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:51 np0005595445 python3[6355]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:51 np0005595445 python3[6426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769418111.1008055-302-53741488510479/source _original_basename=tmptmds0qw7 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:52 np0005595445 python3[6528]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:52 np0005595445 python3[6601]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769418112.239112-382-261840777848615/source _original_basename=tmpdxungfil follow=False checksum=f8e7a25c67610e75d05bab7943d515214e034b21 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:53 np0005595445 python3[6649]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:01:54 np0005595445 python3[6675]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:01:54 np0005595445 python3[6755]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:01:54 np0005595445 python3[6828]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418114.3463724-452-230568505334353/source _original_basename=tmplnvbqk7o follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:01:55 np0005595445 python3[6879]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-0dcc-2282-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:01:56 np0005595445 python3[6907]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-0dcc-2282-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 04:01:57 np0005595445 python3[6935]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:02:18 np0005595445 python3[6961]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:02:19 np0005595445 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 04:03:18 np0005595445 systemd-logind[783]: Session 1 logged out. Waiting for processes to exit.
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 04:03:25 np0005595445 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 04:03:25 np0005595445 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.1987] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 04:03:25 np0005595445 systemd-udevd[6965]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2140] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2163] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2165] device (eth1): carrier: link connected
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2168] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2173] policy: auto-activating connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2177] device (eth1): Activation: starting connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2178] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2180] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2183] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:03:25 np0005595445 NetworkManager[856]: <info>  [1769418205.2187] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:03:25 np0005595445 systemd[4320]: Starting Mark boot as successful...
Jan 26 04:03:25 np0005595445 systemd[4320]: Finished Mark boot as successful.
Jan 26 04:03:25 np0005595445 systemd-logind[783]: New session 3 of user zuul.
Jan 26 04:03:25 np0005595445 systemd[1]: Started Session 3 of User zuul.
Jan 26 04:03:26 np0005595445 python3[6996]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-b722-d7c5-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:03:36 np0005595445 python3[7076]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:03:36 np0005595445 python3[7149]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769418215.87125-155-214241887746077/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=6ef8a714238b6321d3120b756c9e2e82d4276bd3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:03:36 np0005595445 python3[7199]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:03:36 np0005595445 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 04:03:36 np0005595445 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 04:03:36 np0005595445 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 04:03:36 np0005595445 systemd[1]: Stopping Network Manager...
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9605] caught SIGTERM, shutting down normally.
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9616] dhcp4 (eth0): canceled DHCP transaction
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9617] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9617] dhcp4 (eth0): state changed no lease
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9619] manager: NetworkManager state is now CONNECTING
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9707] dhcp4 (eth1): canceled DHCP transaction
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9707] dhcp4 (eth1): state changed no lease
Jan 26 04:03:36 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:03:36 np0005595445 NetworkManager[856]: <info>  [1769418216.9775] exiting (success)
Jan 26 04:03:36 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:03:36 np0005595445 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 04:03:36 np0005595445 systemd[1]: Stopped Network Manager.
Jan 26 04:03:36 np0005595445 systemd[1]: NetworkManager.service: Consumed 1.036s CPU time, 9.9M memory peak.
Jan 26 04:03:36 np0005595445 systemd[1]: Starting Network Manager...
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0195] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0196] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0253] manager[0x558bef871000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 04:03:37 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 04:03:37 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0941] hostname: hostname: using hostnamed
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0941] hostname: static hostname changed from (none) to "np0005595445.novalocal"
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0947] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0952] manager[0x558bef871000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0953] manager[0x558bef871000]: rfkill: WWAN hardware radio set enabled
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0977] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0977] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0978] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0978] manager: Networking is enabled by state file
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0980] settings: Loaded settings plugin: keyfile (internal)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.0983] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1003] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1011] dhcp: init: Using DHCP client 'internal'
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1013] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1017] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1027] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1032] device (eth0): carrier: link connected
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1036] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1040] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1040] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1045] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1051] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1055] device (eth1): carrier: link connected
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1058] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1062] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b) (indicated)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1062] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1066] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1071] device (eth1): Activation: starting connection 'Wired connection 1' (01428c05-ae5c-3e90-b32b-9a07a8ac9b4b)
Jan 26 04:03:37 np0005595445 systemd[1]: Started Network Manager.
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1077] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1081] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1082] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1083] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1085] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1087] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1089] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1090] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1092] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1097] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1099] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1105] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1107] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1128] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1130] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1134] device (lo): Activation: successful, device activated.
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1140] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1145] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1200] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1236] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1238] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1240] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1241] device (eth0): Activation: successful, device activated.
Jan 26 04:03:37 np0005595445 systemd[1]: Starting Network Manager Wait Online...
Jan 26 04:03:37 np0005595445 NetworkManager[7211]: <info>  [1769418217.1245] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 04:03:37 np0005595445 python3[7283]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-b722-d7c5-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:03:47 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:04:07 np0005595445 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3710] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 04:04:22 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:04:22 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3925] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3929] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3941] device (eth1): Activation: successful, device activated.
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3949] manager: startup complete
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3952] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <warn>  [1769418262.3962] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.3972] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 systemd[1]: Finished Network Manager Wait Online.
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4067] dhcp4 (eth1): canceled DHCP transaction
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4068] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4068] dhcp4 (eth1): state changed no lease
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4084] policy: auto-activating connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4089] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4090] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4095] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4103] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4115] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4147] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4149] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:04:22 np0005595445 NetworkManager[7211]: <info>  [1769418262.4156] device (eth1): Activation: successful, device activated.
Jan 26 04:04:32 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:04:37 np0005595445 systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 04:04:37 np0005595445 systemd[1]: session-3.scope: Consumed 1.298s CPU time.
Jan 26 04:04:37 np0005595445 systemd-logind[783]: Session 3 logged out. Waiting for processes to exit.
Jan 26 04:04:37 np0005595445 systemd-logind[783]: Removed session 3.
Jan 26 04:05:18 np0005595445 systemd-logind[783]: New session 4 of user zuul.
Jan 26 04:05:18 np0005595445 systemd[1]: Started Session 4 of User zuul.
Jan 26 04:05:18 np0005595445 python3[7392]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:05:18 np0005595445 python3[7465]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418318.361261-373-179846650525927/source _original_basename=tmpr0ymogg4 follow=False checksum=10e957977f1ea6bc363530e08bbb212998c5f9af backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:05:22 np0005595445 systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 04:05:22 np0005595445 systemd-logind[783]: Session 4 logged out. Waiting for processes to exit.
Jan 26 04:05:22 np0005595445 systemd-logind[783]: Removed session 4.
Jan 26 04:07:10 np0005595445 systemd[4320]: Created slice User Background Tasks Slice.
Jan 26 04:07:10 np0005595445 systemd[4320]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 04:07:10 np0005595445 systemd[4320]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 04:12:21 np0005595445 systemd-logind[783]: New session 5 of user zuul.
Jan 26 04:12:21 np0005595445 systemd[1]: Started Session 5 of User zuul.
Jan 26 04:12:21 np0005595445 python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-680e-3fc1-00000000217d-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:22 np0005595445 python3[7555]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:23 np0005595445 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:23 np0005595445 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:23 np0005595445 python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:24 np0005595445 python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:25 np0005595445 python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:12:25 np0005595445 python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418745.0358005-538-68336395017277/source _original_basename=tmpfgmn5bfn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:12:27 np0005595445 python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:12:27 np0005595445 systemd[1]: Reloading.
Jan 26 04:12:27 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:12:28 np0005595445 python3[7916]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 04:12:29 np0005595445 python3[7942]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:29 np0005595445 python3[7970]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:29 np0005595445 python3[7998]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:30 np0005595445 python3[8026]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:31 np0005595445 python3[8053]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-680e-3fc1-000000002184-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:12:31 np0005595445 python3[8083]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 04:12:34 np0005595445 systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 04:12:34 np0005595445 systemd[1]: session-5.scope: Consumed 3.868s CPU time.
Jan 26 04:12:34 np0005595445 systemd-logind[783]: Session 5 logged out. Waiting for processes to exit.
Jan 26 04:12:34 np0005595445 systemd-logind[783]: Removed session 5.
Jan 26 04:12:36 np0005595445 systemd-logind[783]: New session 6 of user zuul.
Jan 26 04:12:36 np0005595445 systemd[1]: Started Session 6 of User zuul.
Jan 26 04:12:36 np0005595445 python3[8117]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 04:12:43 np0005595445 setsebool[8160]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 04:12:43 np0005595445 setsebool[8160]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 04:12:57 np0005595445 kernel: SELinux:  Converting 386 SID table entries...
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:12:57 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  Converting 389 SID table entries...
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:13:08 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:13:26 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 04:13:26 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:13:26 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:13:26 np0005595445 systemd[1]: Reloading.
Jan 26 04:13:26 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:13:27 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:13:33 np0005595445 python3[14013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-9270-47f7-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:13:34 np0005595445 kernel: evm: overlay not supported
Jan 26 04:13:34 np0005595445 systemd[4320]: Starting D-Bus User Message Bus...
Jan 26 04:13:34 np0005595445 dbus-broker-launch[14550]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 04:13:34 np0005595445 dbus-broker-launch[14550]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 04:13:34 np0005595445 systemd[4320]: Started D-Bus User Message Bus.
Jan 26 04:13:34 np0005595445 dbus-broker-lau[14550]: Ready
Jan 26 04:13:34 np0005595445 systemd[4320]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 04:13:34 np0005595445 systemd[4320]: Created slice Slice /user.
Jan 26 04:13:34 np0005595445 systemd[4320]: podman-14476.scope: unit configures an IP firewall, but not running as root.
Jan 26 04:13:34 np0005595445 systemd[4320]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 04:13:34 np0005595445 systemd[4320]: Started podman-14476.scope.
Jan 26 04:13:34 np0005595445 systemd[4320]: Started podman-pause-07735550.scope.
Jan 26 04:13:35 np0005595445 systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 04:13:35 np0005595445 systemd[1]: session-6.scope: Consumed 47.857s CPU time.
Jan 26 04:13:35 np0005595445 systemd-logind[783]: Session 6 logged out. Waiting for processes to exit.
Jan 26 04:13:35 np0005595445 systemd-logind[783]: Removed session 6.
Jan 26 04:13:54 np0005595445 systemd-logind[783]: New session 7 of user zuul.
Jan 26 04:13:54 np0005595445 systemd[1]: Started Session 7 of User zuul.
Jan 26 04:13:55 np0005595445 python3[23767]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:13:55 np0005595445 python3[23969]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:13:56 np0005595445 python3[24429]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005595445.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 04:13:57 np0005595445 python3[24649]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAtPEUMhsrPMrPBtrW9DN2hphJag2++Oa2+RW/bLZtOuHDUx3O2VLMTPtjIlKZnSfgaaLhnryF0u4SgeMfgHgP8= zuul@np0005595443.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 04:13:57 np0005595445 python3[24928]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:13:58 np0005595445 python3[25187]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769418837.356957-151-20203635944896/source _original_basename=tmpy_kqy21s follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:13:58 np0005595445 python3[25547]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 26 04:13:58 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 04:13:58 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 04:13:59 np0005595445 systemd-hostnamed[25675]: Changed pretty hostname to 'compute-1'
Jan 26 04:13:59 np0005595445 systemd-hostnamed[25675]: Hostname set to <compute-1> (static)
Jan 26 04:13:59 np0005595445 NetworkManager[7211]: <info>  [1769418839.0098] hostname: static hostname changed from "np0005595445.novalocal" to "compute-1"
Jan 26 04:13:59 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:13:59 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:13:59 np0005595445 systemd-logind[783]: Session 7 logged out. Waiting for processes to exit.
Jan 26 04:13:59 np0005595445 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 04:13:59 np0005595445 systemd[1]: session-7.scope: Consumed 2.258s CPU time.
Jan 26 04:13:59 np0005595445 systemd-logind[783]: Removed session 7.
Jan 26 04:14:09 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:14:10 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:14:10 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:14:10 np0005595445 systemd[1]: man-db-cache-update.service: Consumed 51.077s CPU time.
Jan 26 04:14:10 np0005595445 systemd[1]: run-r369e89110e9a474b95e0da52deae43cf.service: Deactivated successfully.
Jan 26 04:14:29 np0005595445 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 04:16:10 np0005595445 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 04:16:10 np0005595445 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 04:16:10 np0005595445 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 04:16:10 np0005595445 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 04:18:03 np0005595445 systemd-logind[783]: New session 8 of user zuul.
Jan 26 04:18:03 np0005595445 systemd[1]: Started Session 8 of User zuul.
Jan 26 04:18:04 np0005595445 python3[30002]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:18:06 np0005595445 python3[30119]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:06 np0005595445 python3[30192]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:06 np0005595445 python3[30218]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:07 np0005595445 python3[30291]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:07 np0005595445 python3[30317]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:07 np0005595445 python3[30390]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:08 np0005595445 python3[30416]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:08 np0005595445 python3[30489]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:08 np0005595445 python3[30515]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:09 np0005595445 python3[30588]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:09 np0005595445 python3[30614]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:09 np0005595445 python3[30687]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:09 np0005595445 python3[30713]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:18:10 np0005595445 python3[30786]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769419085.905869-33987-71487597389602/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:18:23 np0005595445 python3[30834]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:23:22 np0005595445 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 04:23:22 np0005595445 systemd[1]: session-8.scope: Consumed 4.856s CPU time.
Jan 26 04:23:22 np0005595445 systemd-logind[783]: Session 8 logged out. Waiting for processes to exit.
Jan 26 04:23:22 np0005595445 systemd-logind[783]: Removed session 8.
Jan 26 04:29:30 np0005595445 systemd-logind[783]: New session 9 of user zuul.
Jan 26 04:29:30 np0005595445 systemd[1]: Started Session 9 of User zuul.
Jan 26 04:29:32 np0005595445 python3.9[31076]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:29:33 np0005595445 python3.9[31257]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:29:43 np0005595445 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 04:29:43 np0005595445 systemd[1]: session-9.scope: Consumed 7.655s CPU time.
Jan 26 04:29:43 np0005595445 systemd-logind[783]: Session 9 logged out. Waiting for processes to exit.
Jan 26 04:29:43 np0005595445 systemd-logind[783]: Removed session 9.
Jan 26 04:29:59 np0005595445 systemd-logind[783]: New session 10 of user zuul.
Jan 26 04:29:59 np0005595445 systemd[1]: Started Session 10 of User zuul.
Jan 26 04:29:59 np0005595445 python3.9[31471]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 04:30:01 np0005595445 python3.9[31645]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:30:02 np0005595445 python3.9[31797]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:30:03 np0005595445 python3.9[31950]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:30:04 np0005595445 python3.9[32102]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:30:05 np0005595445 python3.9[32254]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:30:05 np0005595445 python3.9[32377]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769419804.55097-173-203606680384125/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:30:06 np0005595445 python3.9[32529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:30:07 np0005595445 python3.9[32685]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:30:07 np0005595445 python3.9[32837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:30:08 np0005595445 python3.9[32987]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:30:16 np0005595445 python3.9[33244]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:30:16 np0005595445 python3.9[33394]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:30:18 np0005595445 python3.9[33548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:30:19 np0005595445 python3.9[33706]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:30:20 np0005595445 python3.9[33792]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:31:03 np0005595445 systemd[1]: Reloading.
Jan 26 04:31:03 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:31:03 np0005595445 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 04:31:04 np0005595445 systemd[1]: Reloading.
Jan 26 04:31:04 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:31:04 np0005595445 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 04:31:04 np0005595445 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 04:31:04 np0005595445 systemd[1]: Reloading.
Jan 26 04:31:04 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:31:04 np0005595445 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 04:31:05 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:31:05 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:31:05 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:32:12 np0005595445 kernel: SELinux:  Converting 2723 SID table entries...
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:32:12 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:32:13 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 04:32:13 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:32:13 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:32:13 np0005595445 systemd[1]: Reloading.
Jan 26 04:32:13 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:32:13 np0005595445 systemd[1]: Starting dnf makecache...
Jan 26 04:32:13 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:32:13 np0005595445 dnf[34493]: Failed determining last makecache time.
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-openstack-barbican-42b4c41831408a8e323 110 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 189 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-openstack-cinder-1c00d6490d88e436f26ef 181 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-python-stevedore-c4acc5639fd2329372142 170 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-python-cloudkitty-tests-tempest-2c80f8 179 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-os-refresh-config-9bfc52b5049be2d8de61 159 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 173 kB/s | 3.0 kB     00:00
Jan 26 04:32:13 np0005595445 dnf[34493]: delorean-python-designate-tests-tempest-347fdbc 137 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-glance-1fd12c29b339f30fe823e 149 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 180 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-manila-3c01b7181572c95dac462 181 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-python-whitebox-neutron-tests-tempest- 176 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-octavia-ba397f07a7331190208c 140 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-watcher-c014f81a8647287f6dcc 158 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-ansible-config_template-5ccaa22121a7ff 181 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 177 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-swift-dc98a8463506ac520c469a 163 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-python-tempestconf-8515371b7cceebd4282 134 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: delorean-openstack-heat-ui-013accbfd179753bc3f0 140 kB/s | 3.0 kB     00:00
Jan 26 04:32:14 np0005595445 dnf[34493]: CentOS Stream 9 - BaseOS                         61 kB/s | 6.7 kB     00:00
Jan 26 04:32:14 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:32:14 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:32:14 np0005595445 systemd[1]: man-db-cache-update.service: Consumed 1.198s CPU time.
Jan 26 04:32:14 np0005595445 systemd[1]: run-rb9c92ec70e5d4f1e8cf6b98243a9f807.service: Deactivated successfully.
Jan 26 04:32:14 np0005595445 dnf[34493]: CentOS Stream 9 - AppStream                      29 kB/s | 6.8 kB     00:00
Jan 26 04:32:14 np0005595445 python3.9[35409]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:32:14 np0005595445 dnf[34493]: CentOS Stream 9 - CRB                            58 kB/s | 6.6 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: CentOS Stream 9 - Extras packages                64 kB/s | 7.3 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: dlrn-antelope-testing                           108 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: dlrn-antelope-build-deps                        149 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: centos9-rabbitmq                                 35 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: centos9-storage                                 106 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: centos9-opstools                                121 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: NFV SIG OpenvSwitch                             115 kB/s | 3.0 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: repo-setup-centos-appstream                     113 kB/s | 4.4 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: repo-setup-centos-baseos                         95 kB/s | 3.9 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: repo-setup-centos-highavailability              142 kB/s | 3.9 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: repo-setup-centos-powertools                    204 kB/s | 4.3 kB     00:00
Jan 26 04:32:15 np0005595445 dnf[34493]: Extra Packages for Enterprise Linux 9 - x86_64  247 kB/s |  31 kB     00:00
Jan 26 04:32:16 np0005595445 dnf[34493]: Metadata cache created.
Jan 26 04:32:16 np0005595445 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 04:32:16 np0005595445 systemd[1]: Finished dnf makecache.
Jan 26 04:32:16 np0005595445 systemd[1]: dnf-makecache.service: Consumed 1.942s CPU time.
Jan 26 04:32:16 np0005595445 python3.9[35712]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 04:32:17 np0005595445 python3.9[35865]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 04:32:20 np0005595445 python3.9[36018]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:32:23 np0005595445 python3.9[36170]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 04:32:24 np0005595445 python3.9[36322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:32:27 np0005595445 python3.9[36476]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:32:31 np0005595445 python3.9[36599]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769419947.4861498-662-124391614481009/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:32:32 np0005595445 python3.9[36753]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:32:34 np0005595445 python3.9[36905]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:32:34 np0005595445 python3.9[37058]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:32:36 np0005595445 python3.9[37210]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 04:32:36 np0005595445 python3.9[37363]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 04:32:37 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:32:37 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:32:38 np0005595445 python3.9[37522]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 04:32:39 np0005595445 python3.9[37682]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 04:32:40 np0005595445 python3.9[37835]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 04:32:40 np0005595445 python3.9[37993]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 04:32:41 np0005595445 python3.9[38145]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:32:45 np0005595445 python3.9[38298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:32:46 np0005595445 python3.9[38450]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:32:46 np0005595445 python3.9[38573]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769419965.6798482-1019-60321278574533/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:32:47 np0005595445 python3.9[38725]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:32:47 np0005595445 systemd[1]: Starting Load Kernel Modules...
Jan 26 04:32:47 np0005595445 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 04:32:47 np0005595445 kernel: Bridge firewalling registered
Jan 26 04:32:47 np0005595445 systemd-modules-load[38729]: Inserted module 'br_netfilter'
Jan 26 04:32:47 np0005595445 systemd[1]: Finished Load Kernel Modules.
Jan 26 04:32:48 np0005595445 python3.9[38885]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:32:49 np0005595445 python3.9[39008]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769419968.1550276-1088-254231695047924/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:32:50 np0005595445 python3.9[39160]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:32:53 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:32:53 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:32:53 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:32:53 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:32:53 np0005595445 systemd[1]: Reloading.
Jan 26 04:32:53 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:32:53 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:32:55 np0005595445 python3.9[40888]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:32:56 np0005595445 python3.9[41800]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 04:32:57 np0005595445 python3.9[42508]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:32:57 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:32:57 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:32:57 np0005595445 systemd[1]: man-db-cache-update.service: Consumed 5.302s CPU time.
Jan 26 04:32:57 np0005595445 systemd[1]: run-re983052085a14bd191ab4bb571acf307.service: Deactivated successfully.
Jan 26 04:32:57 np0005595445 python3.9[43337]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:32:58 np0005595445 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 04:32:58 np0005595445 systemd[1]: Starting Authorization Manager...
Jan 26 04:32:58 np0005595445 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 04:32:58 np0005595445 polkitd[43555]: Started polkitd version 0.117
Jan 26 04:32:58 np0005595445 systemd[1]: Started Authorization Manager.
Jan 26 04:32:59 np0005595445 python3.9[43725]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:32:59 np0005595445 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 04:32:59 np0005595445 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 04:32:59 np0005595445 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 04:32:59 np0005595445 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 04:32:59 np0005595445 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 04:33:00 np0005595445 python3.9[43886]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 04:33:03 np0005595445 python3.9[44038]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:33:04 np0005595445 systemd[1]: Reloading.
Jan 26 04:33:04 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:33:05 np0005595445 python3.9[44228]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:33:05 np0005595445 systemd[1]: Reloading.
Jan 26 04:33:05 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:33:06 np0005595445 python3.9[44417]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:33:06 np0005595445 python3.9[44570]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:33:06 np0005595445 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 04:33:07 np0005595445 python3.9[44723]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:33:09 np0005595445 python3.9[44885]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:33:10 np0005595445 python3.9[45038]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:33:10 np0005595445 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 04:33:10 np0005595445 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 04:33:10 np0005595445 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 04:33:10 np0005595445 systemd[1]: Starting Apply Kernel Variables...
Jan 26 04:33:10 np0005595445 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 04:33:10 np0005595445 systemd[1]: Finished Apply Kernel Variables.
Jan 26 04:33:11 np0005595445 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 04:33:11 np0005595445 systemd[1]: session-10.scope: Consumed 2min 21.532s CPU time.
Jan 26 04:33:11 np0005595445 systemd-logind[783]: Session 10 logged out. Waiting for processes to exit.
Jan 26 04:33:11 np0005595445 systemd-logind[783]: Removed session 10.
Jan 26 04:33:16 np0005595445 systemd-logind[783]: New session 11 of user zuul.
Jan 26 04:33:16 np0005595445 systemd[1]: Started Session 11 of User zuul.
Jan 26 04:33:17 np0005595445 python3.9[45223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:33:18 np0005595445 python3.9[45379]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 04:33:19 np0005595445 python3.9[45532]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 04:33:20 np0005595445 python3.9[45690]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 04:33:21 np0005595445 python3.9[45850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:33:22 np0005595445 python3.9[45936]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 04:33:25 np0005595445 python3.9[46100]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:33:39 np0005595445 kernel: SELinux:  Converting 2736 SID table entries...
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:33:39 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:33:39 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 04:33:39 np0005595445 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 04:33:41 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:33:41 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:33:41 np0005595445 systemd[1]: Reloading.
Jan 26 04:33:41 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:33:41 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:33:41 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:33:42 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:33:42 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:33:42 np0005595445 systemd[1]: run-r0fceec3173db4869931d59b965793161.service: Deactivated successfully.
Jan 26 04:33:43 np0005595445 python3.9[47199]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:33:43 np0005595445 systemd[1]: Reloading.
Jan 26 04:33:43 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:33:43 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:33:44 np0005595445 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 04:33:44 np0005595445 chown[47241]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 04:33:44 np0005595445 ovs-ctl[47246]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 04:33:44 np0005595445 ovs-ctl[47246]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 04:33:44 np0005595445 ovs-ctl[47246]: Starting ovsdb-server [  OK  ]
Jan 26 04:33:44 np0005595445 ovs-vsctl[47295]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 04:33:44 np0005595445 ovs-vsctl[47315]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"5f259fb6-5896-4c89-8853-1dd537a2ebf7\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 04:33:44 np0005595445 ovs-ctl[47246]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 04:33:44 np0005595445 ovs-ctl[47246]: Enabling remote OVSDB managers [  OK  ]
Jan 26 04:33:44 np0005595445 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 04:33:44 np0005595445 ovs-vsctl[47321]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 04:33:44 np0005595445 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 04:33:44 np0005595445 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 04:33:44 np0005595445 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 04:33:44 np0005595445 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 04:33:44 np0005595445 ovs-ctl[47366]: Inserting openvswitch module [  OK  ]
Jan 26 04:33:44 np0005595445 ovs-ctl[47334]: Starting ovs-vswitchd [  OK  ]
Jan 26 04:33:44 np0005595445 ovs-ctl[47334]: Enabling remote OVSDB managers [  OK  ]
Jan 26 04:33:44 np0005595445 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 04:33:44 np0005595445 ovs-vsctl[47384]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 26 04:33:44 np0005595445 systemd[1]: Starting Open vSwitch...
Jan 26 04:33:44 np0005595445 systemd[1]: Finished Open vSwitch.
Jan 26 04:33:45 np0005595445 python3.9[47535]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:33:46 np0005595445 python3.9[47687]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 04:33:47 np0005595445 kernel: SELinux:  Converting 2750 SID table entries...
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:33:47 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:33:48 np0005595445 python3.9[47842]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:33:49 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 04:33:49 np0005595445 python3.9[48000]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:33:52 np0005595445 python3.9[48153]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:33:53 np0005595445 python3.9[48440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 04:33:54 np0005595445 python3.9[48592]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:33:55 np0005595445 python3.9[48746]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:33:59 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:33:59 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:33:59 np0005595445 systemd[1]: Reloading.
Jan 26 04:33:59 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:33:59 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:33:59 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:34:00 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:34:00 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:34:00 np0005595445 systemd[1]: run-r803c64bf4b0f4d7e97c0d98a835bf6b5.service: Deactivated successfully.
Jan 26 04:34:01 np0005595445 python3.9[49064]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:34:01 np0005595445 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 04:34:01 np0005595445 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 04:34:01 np0005595445 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 04:34:01 np0005595445 systemd[1]: Stopping Network Manager...
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5308] caught SIGTERM, shutting down normally.
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): canceled DHCP transaction
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5325] dhcp4 (eth0): state changed no lease
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5328] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 04:34:01 np0005595445 NetworkManager[7211]: <info>  [1769420041.5442] exiting (success)
Jan 26 04:34:01 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:34:01 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:34:01 np0005595445 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 04:34:01 np0005595445 systemd[1]: Stopped Network Manager.
Jan 26 04:34:01 np0005595445 systemd[1]: NetworkManager.service: Consumed 11.672s CPU time, 4.1M memory peak, read 0B from disk, written 39.0K to disk.
Jan 26 04:34:01 np0005595445 systemd[1]: Starting Network Manager...
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6070] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ddaf00d4-1dc5-4d7f-b5ab-626f2ab79e8a)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6073] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6139] manager[0x563bd18df000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 04:34:01 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 04:34:01 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6894] hostname: hostname: using hostnamed
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6901] hostname: static hostname changed from (none) to "compute-1"
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6908] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6913] manager[0x563bd18df000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6913] manager[0x563bd18df000]: rfkill: WWAN hardware radio set enabled
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6939] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6950] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6951] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6952] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6952] manager: Networking is enabled by state file
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6954] settings: Loaded settings plugin: keyfile (internal)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6959] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.6990] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7004] dhcp: init: Using DHCP client 'internal'
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7007] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7014] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7021] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7035] device (lo): Activation: starting connection 'lo' (02fbee44-d9d6-4838-9f7e-bcbf35b7d384)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7044] device (eth0): carrier: link connected
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7048] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7053] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7054] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7060] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7067] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7072] device (eth1): carrier: link connected
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7076] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7081] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f) (indicated)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7081] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7086] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7092] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7098] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 04:34:01 np0005595445 systemd[1]: Started Network Manager.
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7107] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7109] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7111] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7113] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7116] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7119] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7121] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7124] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7130] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7134] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7151] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7172] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7184] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7189] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7192] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7198] device (lo): Activation: successful, device activated.
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7212] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 04:34:01 np0005595445 systemd[1]: Starting Network Manager Wait Online...
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7306] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7314] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7316] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7320] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7322] device (eth1): Activation: successful, device activated.
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7347] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7348] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7354] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7358] device (eth0): Activation: successful, device activated.
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7364] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 04:34:01 np0005595445 NetworkManager[49073]: <info>  [1769420041.7368] manager: startup complete
Jan 26 04:34:01 np0005595445 systemd[1]: Finished Network Manager Wait Online.
Jan 26 04:34:02 np0005595445 python3.9[49290]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:34:09 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:34:09 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:34:09 np0005595445 systemd[1]: Reloading.
Jan 26 04:34:09 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:34:09 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:34:09 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:34:10 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:34:10 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:34:10 np0005595445 systemd[1]: run-r2118850b088a4b949e0110ec8d64533f.service: Deactivated successfully.
Jan 26 04:34:11 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:34:12 np0005595445 python3.9[49750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:34:12 np0005595445 python3.9[49902]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:13 np0005595445 python3.9[50056]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:14 np0005595445 python3.9[50208]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:15 np0005595445 python3.9[50360]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:15 np0005595445 python3.9[50512]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:16 np0005595445 python3.9[50664]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:34:17 np0005595445 python3.9[50789]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420056.1948519-643-205495253638858/.source _original_basename=.tievaic0 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:18 np0005595445 python3.9[50941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:18 np0005595445 python3.9[51093]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 04:34:19 np0005595445 python3.9[51245]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:22 np0005595445 python3.9[51672]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 04:34:23 np0005595445 ansible-async_wrapper.py[51847]: Invoked with j856592117221 300 /home/zuul/.ansible/tmp/ansible-tmp-1769420062.332005-841-155806645680929/AnsiballZ_edpm_os_net_config.py _
Jan 26 04:34:23 np0005595445 ansible-async_wrapper.py[51850]: Starting module and watcher
Jan 26 04:34:23 np0005595445 ansible-async_wrapper.py[51850]: Start watching 51851 (300)
Jan 26 04:34:23 np0005595445 ansible-async_wrapper.py[51851]: Start module (51851)
Jan 26 04:34:23 np0005595445 ansible-async_wrapper.py[51847]: Return async_wrapper task started.
Jan 26 04:34:23 np0005595445 python3.9[51852]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 04:34:24 np0005595445 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 04:34:24 np0005595445 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 04:34:24 np0005595445 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 04:34:24 np0005595445 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 04:34:24 np0005595445 kernel: cfg80211: failed to load regulatory.db
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.2481] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.2504] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3057] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3060] audit: op="connection-add" uuid="3cc5a451-bfd9-4013-aa49-a471bbbe10a9" name="br-ex-br" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3079] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3081] audit: op="connection-add" uuid="9153e954-4bca-40ba-b358-459a063898be" name="br-ex-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3098] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3100] audit: op="connection-add" uuid="d42f5f9a-7cf2-4296-9705-7315b5bff1d8" name="eth1-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3116] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3118] audit: op="connection-add" uuid="1fe6ef39-ff44-4c4a-b7a1-fca2c446f007" name="vlan20-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3131] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3134] audit: op="connection-add" uuid="939b3879-66d8-4592-a961-3770dc5e15ea" name="vlan21-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3148] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3150] audit: op="connection-add" uuid="b6a2cca4-fb3d-4b00-8296-0f5997fdc3ce" name="vlan22-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3162] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3164] audit: op="connection-add" uuid="5d1bc407-4dcf-4a9f-9d44-794d818d114b" name="vlan23-port" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3184] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,connection.timestamp" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3199] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3201] audit: op="connection-add" uuid="083c5da3-5b2f-4a68-b3d7-35d321110d3b" name="br-ex-if" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3340] audit: op="connection-update" uuid="005e0c67-cd91-5f06-b6a9-1e083d81271f" name="ci-private-network" args="ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.addresses,ipv4.never-default,ovs-interface.type,ipv6.dns,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv6.method,ipv6.addresses,connection.port-type,connection.controller,connection.slave-type,connection.timestamp,connection.master,ovs-external-ids.data" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3357] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3359] audit: op="connection-add" uuid="400efc6f-8710-4783-a66c-228c37e355f6" name="vlan20-if" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3374] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3376] audit: op="connection-add" uuid="ef305799-d499-43fd-b8a7-1fa9c3f71516" name="vlan21-if" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3397] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3399] audit: op="connection-add" uuid="13f21556-e6b4-46b6-9b87-1f2636ded6fb" name="vlan22-if" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3415] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3417] audit: op="connection-add" uuid="3926ba24-a6cd-49fb-8bda-4960a92c7804" name="vlan23-if" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3429] audit: op="connection-delete" uuid="01428c05-ae5c-3e90-b32b-9a07a8ac9b4b" name="Wired connection 1" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3442] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3445] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3451] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3455] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (3cc5a451-bfd9-4013-aa49-a471bbbe10a9)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3457] audit: op="connection-activate" uuid="3cc5a451-bfd9-4013-aa49-a471bbbe10a9" name="br-ex-br" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3459] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3460] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3465] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3468] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (9153e954-4bca-40ba-b358-459a063898be)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3470] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3472] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3476] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3479] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d42f5f9a-7cf2-4296-9705-7315b5bff1d8)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3481] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3483] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3487] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3491] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1fe6ef39-ff44-4c4a-b7a1-fca2c446f007)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3493] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3494] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3499] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3503] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (939b3879-66d8-4592-a961-3770dc5e15ea)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3505] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3506] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3511] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3514] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (b6a2cca4-fb3d-4b00-8296-0f5997fdc3ce)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3516] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3518] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3522] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3526] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (5d1bc407-4dcf-4a9f-9d44-794d818d114b)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3527] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3530] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3532] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3538] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3540] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3544] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3548] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (083c5da3-5b2f-4a68-b3d7-35d321110d3b)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3550] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3554] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3556] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3558] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3559] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3569] device (eth1): disconnecting for new activation request.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3571] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3574] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3577] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3581] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3585] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3587] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3590] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3595] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (400efc6f-8710-4783-a66c-228c37e355f6)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3596] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3600] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3601] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3603] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3606] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3608] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3611] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3617] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ef305799-d499-43fd-b8a7-1fa9c3f71516)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3618] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3621] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3623] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3625] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3629] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3630] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3634] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3639] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (13f21556-e6b4-46b6-9b87-1f2636ded6fb)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3640] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3644] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3646] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3648] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3651] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <warn>  [1769420065.3653] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3656] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3661] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (3926ba24-a6cd-49fb-8bda-4960a92c7804)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3662] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3665] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3667] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3669] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3671] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3684] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3686] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3690] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3692] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3699] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3719] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 kernel: ovs-system: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3724] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3729] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3731] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3736] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3740] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3744] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 kernel: Timeout policy base is empty
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3746] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3751] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3755] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3759] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3761] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3766] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 systemd-udevd[51857]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3770] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3775] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3777] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3781] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3785] dhcp4 (eth0): canceled DHCP transaction
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3786] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3786] dhcp4 (eth0): state changed no lease
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3788] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3798] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3805] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51853 uid=0 result="fail" reason="Device is not activated"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3811] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3818] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3825] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3832] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3834] dhcp4 (eth0): state changed new lease, address=38.102.83.217
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.3888] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4094] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4099] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4113] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4116] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4121] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4136] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4143] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4145] device (eth1): released from controller device eth1
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4152] device (eth1): disconnecting for new activation request.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4153] audit: op="connection-activate" uuid="005e0c67-cd91-5f06-b6a9-1e083d81271f" name="ci-private-network" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4153] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4154] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4155] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4156] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4158] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4159] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4161] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4167] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4170] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4173] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4177] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4185] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 kernel: br-ex: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4188] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4191] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4195] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4200] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4205] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4209] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4232] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4242] device (eth1): Activation: starting connection 'ci-private-network' (005e0c67-cd91-5f06-b6a9-1e083d81271f)
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4245] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4249] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4252] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4257] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4269] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4271] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 kernel: vlan22: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4450] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4458] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4470] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 kernel: vlan20: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4476] device (eth1): Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 systemd-udevd[51859]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4501] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4532] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4537] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4543] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4624] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4628] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 04:34:25 np0005595445 kernel: vlan21: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4661] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4669] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4700] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4702] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4706] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4712] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4719] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4723] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 kernel: vlan23: entered promiscuous mode
Jan 26 04:34:25 np0005595445 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4810] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4823] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4844] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.4860] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5034] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5038] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5043] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5051] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5053] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 04:34:25 np0005595445 NetworkManager[49073]: <info>  [1769420065.5059] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 04:34:26 np0005595445 NetworkManager[49073]: <info>  [1769420066.6771] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 04:34:26 np0005595445 NetworkManager[49073]: <info>  [1769420066.8431] checkpoint[0x563bd18b5950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 04:34:26 np0005595445 NetworkManager[49073]: <info>  [1769420066.8435] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51853 uid=0 result="success"
Jan 26 04:34:26 np0005595445 python3.9[52215]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=status _async_dir=/root/.ansible_async
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.1716] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.1730] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.4237] audit: op="networking-control" arg="global-dns-configuration" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.4271] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.4309] audit: op="networking-control" arg="global-dns-configuration" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.4329] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.5719] checkpoint[0x563bd18b5a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 04:34:27 np0005595445 NetworkManager[49073]: <info>  [1769420067.5723] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51853 uid=0 result="success"
Jan 26 04:34:27 np0005595445 ansible-async_wrapper.py[51851]: Module complete (51851)
Jan 26 04:34:28 np0005595445 ansible-async_wrapper.py[51850]: Done in kid B.
Jan 26 04:34:30 np0005595445 python3.9[52321]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=status _async_dir=/root/.ansible_async
Jan 26 04:34:30 np0005595445 python3.9[52423]: ansible-ansible.legacy.async_status Invoked with jid=j856592117221.51847 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 04:34:31 np0005595445 python3.9[52575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:34:31 np0005595445 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 04:34:32 np0005595445 python3.9[52700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420071.1945255-922-195691267603125/.source.returncode _original_basename=.ozvuh3cj follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:32 np0005595445 python3.9[52852]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:34:33 np0005595445 python3.9[52975]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420072.4897892-970-35151221593775/.source.cfg _original_basename=._0xi2g7z follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:34 np0005595445 python3.9[53128]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:34:34 np0005595445 systemd[1]: Reloading Network Manager...
Jan 26 04:34:34 np0005595445 NetworkManager[49073]: <info>  [1769420074.4272] audit: op="reload" arg="0" pid=53132 uid=0 result="success"
Jan 26 04:34:34 np0005595445 NetworkManager[49073]: <info>  [1769420074.4280] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 04:34:34 np0005595445 systemd[1]: Reloaded Network Manager.
Jan 26 04:34:34 np0005595445 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 04:34:34 np0005595445 systemd[1]: session-11.scope: Consumed 53.693s CPU time.
Jan 26 04:34:34 np0005595445 systemd-logind[783]: Session 11 logged out. Waiting for processes to exit.
Jan 26 04:34:34 np0005595445 systemd-logind[783]: Removed session 11.
Jan 26 04:34:40 np0005595445 systemd-logind[783]: New session 12 of user zuul.
Jan 26 04:34:40 np0005595445 systemd[1]: Started Session 12 of User zuul.
Jan 26 04:34:41 np0005595445 python3.9[53318]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:34:42 np0005595445 python3.9[53472]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:34:44 np0005595445 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 04:34:44 np0005595445 python3.9[53667]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:34:45 np0005595445 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 04:34:45 np0005595445 systemd[1]: session-12.scope: Consumed 2.240s CPU time.
Jan 26 04:34:45 np0005595445 systemd-logind[783]: Session 12 logged out. Waiting for processes to exit.
Jan 26 04:34:45 np0005595445 systemd-logind[783]: Removed session 12.
Jan 26 04:34:50 np0005595445 systemd-logind[783]: New session 13 of user zuul.
Jan 26 04:34:50 np0005595445 systemd[1]: Started Session 13 of User zuul.
Jan 26 04:34:51 np0005595445 python3.9[53848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:34:52 np0005595445 python3.9[54002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:34:53 np0005595445 python3.9[54159]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:34:53 np0005595445 python3.9[54243]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:34:56 np0005595445 python3.9[54396]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:34:57 np0005595445 python3.9[54592]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:34:58 np0005595445 python3.9[54744]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:34:58 np0005595445 systemd[1]: var-lib-containers-storage-overlay-compat2610399325-merged.mount: Deactivated successfully.
Jan 26 04:34:58 np0005595445 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1689028226-merged.mount: Deactivated successfully.
Jan 26 04:34:58 np0005595445 podman[54745]: 2026-01-26 09:34:58.293059283 +0000 UTC m=+0.066136182 system refresh
Jan 26 04:34:59 np0005595445 python3.9[54906]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:34:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:34:59 np0005595445 python3.9[55029]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420098.4922214-193-194955250657632/.source.json follow=False _original_basename=podman_network_config.j2 checksum=4900a580bdba2a04d3b8e5150265ba925035f2ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:00 np0005595445 python3.9[55181]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:00 np0005595445 python3.9[55304]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420099.9602005-238-190889517855/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:01 np0005595445 python3.9[55456]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:02 np0005595445 python3.9[55608]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:02 np0005595445 python3.9[55760]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:03 np0005595445 python3.9[55912]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:04 np0005595445 python3.9[56064]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:35:06 np0005595445 python3.9[56217]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:35:07 np0005595445 python3.9[56371]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:35:08 np0005595445 python3.9[56523]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:35:09 np0005595445 python3.9[56675]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:35:10 np0005595445 python3.9[56828]: ansible-service_facts Invoked
Jan 26 04:35:10 np0005595445 network[56845]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:35:10 np0005595445 network[56846]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:35:10 np0005595445 network[56847]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:35:15 np0005595445 python3.9[57301]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:35:19 np0005595445 python3.9[57454]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 04:35:20 np0005595445 python3.9[57608]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:21 np0005595445 python3.9[57734]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420120.0401855-671-249412341059297/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:21 np0005595445 python3.9[57888]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:22 np0005595445 python3.9[58015]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420121.395794-716-179553374390144/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:24 np0005595445 python3.9[58169]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:25 np0005595445 python3.9[58323]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:35:26 np0005595445 python3.9[58407]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:35:29 np0005595445 python3.9[58561]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:35:30 np0005595445 python3.9[58645]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:35:30 np0005595445 systemd[1]: Stopping NTP client/server...
Jan 26 04:35:30 np0005595445 chronyd[791]: chronyd exiting
Jan 26 04:35:30 np0005595445 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 04:35:30 np0005595445 systemd[1]: Stopped NTP client/server.
Jan 26 04:35:30 np0005595445 systemd[1]: Starting NTP client/server...
Jan 26 04:35:30 np0005595445 chronyd[58653]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 04:35:30 np0005595445 chronyd[58653]: Frequency -26.722 +/- 0.705 ppm read from /var/lib/chrony/drift
Jan 26 04:35:30 np0005595445 chronyd[58653]: Loaded seccomp filter (level 2)
Jan 26 04:35:30 np0005595445 systemd[1]: Started NTP client/server.
Jan 26 04:35:30 np0005595445 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 04:35:30 np0005595445 systemd[1]: session-13.scope: Consumed 25.672s CPU time.
Jan 26 04:35:30 np0005595445 systemd-logind[783]: Session 13 logged out. Waiting for processes to exit.
Jan 26 04:35:30 np0005595445 systemd-logind[783]: Removed session 13.
Jan 26 04:35:35 np0005595445 systemd-logind[783]: New session 14 of user zuul.
Jan 26 04:35:35 np0005595445 systemd[1]: Started Session 14 of User zuul.
Jan 26 04:35:36 np0005595445 python3.9[58834]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:37 np0005595445 python3.9[58986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:38 np0005595445 python3.9[59109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420136.9927318-58-154637409406390/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:38 np0005595445 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 04:35:38 np0005595445 systemd[1]: session-14.scope: Consumed 1.594s CPU time.
Jan 26 04:35:38 np0005595445 systemd-logind[783]: Session 14 logged out. Waiting for processes to exit.
Jan 26 04:35:38 np0005595445 systemd-logind[783]: Removed session 14.
Jan 26 04:35:45 np0005595445 systemd-logind[783]: New session 15 of user zuul.
Jan 26 04:35:45 np0005595445 systemd[1]: Started Session 15 of User zuul.
Jan 26 04:35:46 np0005595445 python3.9[59287]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:35:47 np0005595445 python3.9[59443]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:48 np0005595445 python3.9[59618]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:49 np0005595445 python3.9[59741]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769420147.6604342-79-150054414921999/.source.json _original_basename=.35u2wgzl follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:50 np0005595445 python3.9[59893]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:51 np0005595445 python3.9[60016]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420150.1938748-148-206798390406656/.source _original_basename=.wzyp2tud follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:51 np0005595445 python3.9[60168]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:52 np0005595445 python3.9[60320]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:53 np0005595445 python3.9[60443]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420152.2653298-220-90465004562318/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:53 np0005595445 python3.9[60595]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:54 np0005595445 python3.9[60718]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769420153.4184158-220-34844840439307/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:35:55 np0005595445 python3.9[60870]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:55 np0005595445 python3.9[61022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:56 np0005595445 python3.9[61145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420155.225621-331-137851243582868/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:56 np0005595445 python3.9[61297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:35:57 np0005595445 python3.9[61420]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420156.4243226-376-177409186885814/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:35:58 np0005595445 python3.9[61572]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:35:58 np0005595445 systemd[1]: Reloading.
Jan 26 04:35:58 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:35:58 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:35:58 np0005595445 systemd[1]: Reloading.
Jan 26 04:35:59 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:35:59 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:35:59 np0005595445 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 04:35:59 np0005595445 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 04:35:59 np0005595445 python3.9[61799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:00 np0005595445 python3.9[61922]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420159.3804204-445-103471130912335/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:01 np0005595445 python3.9[62074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:01 np0005595445 python3.9[62197]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420160.564569-490-268228402200051/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:02 np0005595445 python3.9[62349]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:36:02 np0005595445 systemd[1]: Reloading.
Jan 26 04:36:02 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:36:02 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:36:02 np0005595445 systemd[1]: Reloading.
Jan 26 04:36:02 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:36:02 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:36:02 np0005595445 systemd[1]: Starting Create netns directory...
Jan 26 04:36:02 np0005595445 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 04:36:02 np0005595445 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 04:36:02 np0005595445 systemd[1]: Finished Create netns directory.
Jan 26 04:36:03 np0005595445 python3.9[62574]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:36:03 np0005595445 network[62591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:36:03 np0005595445 network[62592]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:36:03 np0005595445 network[62593]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:36:08 np0005595445 python3.9[62859]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:36:08 np0005595445 systemd[1]: Reloading.
Jan 26 04:36:08 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:36:08 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:36:08 np0005595445 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 04:36:08 np0005595445 iptables.init[62898]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 04:36:08 np0005595445 iptables.init[62898]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 04:36:08 np0005595445 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 04:36:08 np0005595445 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 04:36:09 np0005595445 python3.9[63094]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:36:10 np0005595445 python3.9[63248]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:36:10 np0005595445 systemd[1]: Reloading.
Jan 26 04:36:10 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:36:10 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:36:10 np0005595445 systemd[1]: Starting Netfilter Tables...
Jan 26 04:36:10 np0005595445 systemd[1]: Finished Netfilter Tables.
Jan 26 04:36:11 np0005595445 python3.9[63441]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:36:12 np0005595445 python3.9[63594]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:13 np0005595445 python3.9[63719]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420172.1761847-697-161030953143602/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:14 np0005595445 python3.9[63872]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:36:14 np0005595445 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 04:36:14 np0005595445 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 04:36:14 np0005595445 python3.9[64028]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:15 np0005595445 python3.9[64180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:16 np0005595445 python3.9[64303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420175.2141292-790-142833297769001/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:17 np0005595445 python3.9[64455]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 04:36:17 np0005595445 systemd[1]: Starting Time & Date Service...
Jan 26 04:36:17 np0005595445 systemd[1]: Started Time & Date Service.
Jan 26 04:36:18 np0005595445 python3.9[64611]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:19 np0005595445 python3.9[64763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:19 np0005595445 python3.9[64886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420178.5664887-895-9522122889112/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:20 np0005595445 python3.9[65038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:20 np0005595445 python3.9[65161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420179.8963096-940-20143349708409/.source.yaml _original_basename=.ihq_hjjm follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:21 np0005595445 python3.9[65313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:22 np0005595445 python3.9[65436]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420181.138754-985-102414540591828/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:22 np0005595445 python3.9[65588]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:36:23 np0005595445 python3.9[65741]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:36:24 np0005595445 python3[65894]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 04:36:25 np0005595445 python3.9[66046]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:25 np0005595445 python3.9[66169]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420184.518629-1102-191018156343412/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:26 np0005595445 python3.9[66321]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:26 np0005595445 python3.9[66444]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420185.8124213-1147-74818182575118/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:27 np0005595445 python3.9[66596]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:28 np0005595445 python3.9[66719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420187.2821174-1192-26973545174162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:28 np0005595445 python3.9[66871]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:29 np0005595445 python3.9[66994]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420188.5483458-1237-26728761888648/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:30 np0005595445 python3.9[67146]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:36:30 np0005595445 python3.9[67269]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420189.8314896-1282-199942946282213/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:31 np0005595445 python3.9[67421]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:32 np0005595445 python3.9[67573]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:36:33 np0005595445 python3.9[67732]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:34 np0005595445 python3.9[67885]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:34 np0005595445 python3.9[68037]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:36 np0005595445 python3.9[68189]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 04:36:36 np0005595445 python3.9[68342]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 04:36:37 np0005595445 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 04:36:37 np0005595445 systemd[1]: session-15.scope: Consumed 34.654s CPU time.
Jan 26 04:36:37 np0005595445 systemd-logind[783]: Session 15 logged out. Waiting for processes to exit.
Jan 26 04:36:37 np0005595445 systemd-logind[783]: Removed session 15.
Jan 26 04:36:43 np0005595445 systemd-logind[783]: New session 16 of user zuul.
Jan 26 04:36:43 np0005595445 systemd[1]: Started Session 16 of User zuul.
Jan 26 04:36:44 np0005595445 python3.9[68523]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 04:36:45 np0005595445 python3.9[68675]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:36:46 np0005595445 python3.9[68827]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:36:47 np0005595445 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 04:36:47 np0005595445 python3.9[68979]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDm+Vrn31pimz+Of4pkRaSS+qazCMrOF2INZ0EZsyoNG5922K2xwdC9F6r4k2L54HPEpDiazPoDsOHQvs1I+CvayNM2D+8hZhvqxZOMimP8b056aM14nht9ADrJUnlaDs57FkgIKQdxma9I0sW8Up3bbLchFOj2grOjH7gRdUBxblzIS01/P5NV8/kPsRXDoCgx+QAxU2nEqyCQd0JXLKoy+v6t+pG7We9wFXXr2z4XmAx7yeU0Y6NsJ1Seies0apLTmfK3HAtj/3LObvZegqVGDFtl5spotTmJdPJUCZhniaUmyYZ4jtIEno86Bf8OhS3NvLsxmNXuJcInlmCHGXDP9FPBrxG+yVB63FUAeyejCXntEyOzXFp8fiCuOVQuqDTWB4UxTRYh3EqVruxhY1taarew/VfsxIAxv6BWsqtvh/6xtRtJ9vTSDHsDTRaOcChfT5BnATFJ+Ilwpve8C4bjRVdlStH+99TgtNPOg2Fxf8scyIHInM9c4Yn7g8YTiyk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICrJdFptF1rp2hjeKcc0nSEhHvDtAYFU4gfqZN6U+WTb#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNa2lKVjuYCljd0rl1qDkTP3ZoTV9fkbcXvtxSizwygrF6dU+RWdeB3LOkT5U/2GTJuWvOqxJBc3Y1d0b3Dj5Do=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyi0WEBS9Gc5Xay4vqFSdv0cJGdtezg+CrNF/vjEeF3l4EhpAAj7XRLEhEU1kz0DDKkzclG65hBNPO4/9cfzEa31EsSmzOqjqZp5ri20HVDkiZlUTTklhrbJGydUw6mcy+rIN1qsUugVHwkA9ufZLvzm9wvljzL+WPt1o41GT42NdNzyfPfnqf7HMDziNUNUUZjqsoy+DQnlMl3c3NHiGysPJ6IssbLBCFzPdBHpEYmR8b44qlJEhx3RYWl3QLcXAyoK7VpPdFO4ltMT+0KVVbLO9IUrocCQ4HfafPn/mV1Rq3phDWvCTRfRo07Mu4Oc4XBu+RIk9tt1WTIdT/ZusPUNSkFgprdU9zFIHLR0KyIX4qRSuWBeB20Ic5pvkRvNtwLB8lPt4NVi7bmun6moO8nu6cOjJ61CCAobDSEL/Z2cG3ADucjCSKtWLM0eSdt6T71NmULMhdB8ljIK4em/NCf/qZWjYr70WKyIZ9b8N5lDO8NF1tbPJyu+O0ebq/JN8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAaib//yQ1QyvWijjfui4OBtTtMt7Dos+hlx8rucs2Tn#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7YXsQWyEQWSdy5tcEAtltn11CwuaqW/S8S3OB1580hTlcLZWLPDHbzSwNDf13HBG9wgLFgmueLB8U6J7wvvcM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0TZpcPGqQPKNdLKsJSWd1uRV3wOVDiIo3gYwVWAuH5m+Wvpw34ZI+6+d4y3DWMqDRZVWAVV0NNFB+b4MQeivx4S7KMCvBctzJ6VIyUDL5NZrwys0sYPH+33ncdZd6C8LrfCvIct+DbWCx72RQ+G0yRbYK1r/m5+dzW2411NqWn8kJkBUeLJIqT2vhFoNpO8NaWSVlWEgl5YunYEPS4v5NSM88ke6Gzc5X5sjxsz65REj6/1BXsA+quwcTAe/KC1/1Rr2cufefwf0uayM6sGuUDATjWIw36YqUeL9wc/IDdIEFEvj2hr/v+r6laaKMidOYJXBiQwIWpgWCOosSj4vrPQmDfqjOa8sAn7yWPVgxyARccavEO89zV2lpFcYTdqegPxjB90lD3Q1pMU6veJUWTRo0LAZ6n9rsRBgF0Mhr75T32Lbqf3KBro6/nPrp1XCD08mNv2cEYwp+put7vwvHzN1nPztqMsIDAMJMupwI+Buyr3xCPHe3hcAavahF+YM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINbUUMKlV4hksqDn2YVVAHPCHip80h7zj0rReM94Ja2l#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFtD30BOt1BlR6BYm8DU7sxF5fAzZ/aciKetiRsXWlbsXS3Z4mVG1ZAF9AhArV+OaapsLeaQFybIC0e2fudJfos=#012 create=True mode=0644 path=/tmp/ansible.5wv34se6 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:48 np0005595445 python3.9[69133]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.5wv34se6' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:36:49 np0005595445 python3.9[69287]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.5wv34se6 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:36:49 np0005595445 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 04:36:49 np0005595445 systemd[1]: session-16.scope: Consumed 3.334s CPU time.
Jan 26 04:36:49 np0005595445 systemd-logind[783]: Session 16 logged out. Waiting for processes to exit.
Jan 26 04:36:49 np0005595445 systemd-logind[783]: Removed session 16.
Jan 26 04:36:57 np0005595445 systemd-logind[783]: New session 17 of user zuul.
Jan 26 04:36:57 np0005595445 systemd[1]: Started Session 17 of User zuul.
Jan 26 04:36:58 np0005595445 python3.9[69469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:36:59 np0005595445 python3.9[69625]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 04:37:00 np0005595445 python3.9[69779]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:37:01 np0005595445 python3.9[69934]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:37:02 np0005595445 python3.9[70087]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:37:02 np0005595445 python3.9[70241]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:37:03 np0005595445 python3.9[70396]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:37:04 np0005595445 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 04:37:04 np0005595445 systemd[1]: session-17.scope: Consumed 4.472s CPU time.
Jan 26 04:37:04 np0005595445 systemd-logind[783]: Session 17 logged out. Waiting for processes to exit.
Jan 26 04:37:04 np0005595445 systemd-logind[783]: Removed session 17.
Jan 26 04:37:09 np0005595445 systemd-logind[783]: New session 18 of user zuul.
Jan 26 04:37:09 np0005595445 systemd[1]: Started Session 18 of User zuul.
Jan 26 04:37:10 np0005595445 python3.9[70575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:37:11 np0005595445 python3.9[70731]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:37:12 np0005595445 python3.9[70815]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 04:37:14 np0005595445 python3.9[70966]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:37:15 np0005595445 python3.9[71117]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 04:37:16 np0005595445 python3.9[71267]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:37:16 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:37:16 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:37:17 np0005595445 python3.9[71418]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:37:17 np0005595445 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 04:37:17 np0005595445 systemd[1]: session-18.scope: Consumed 5.779s CPU time.
Jan 26 04:37:17 np0005595445 systemd-logind[783]: Session 18 logged out. Waiting for processes to exit.
Jan 26 04:37:17 np0005595445 systemd-logind[783]: Removed session 18.
Jan 26 04:37:25 np0005595445 systemd-logind[783]: New session 19 of user zuul.
Jan 26 04:37:25 np0005595445 systemd[1]: Started Session 19 of User zuul.
Jan 26 04:37:31 np0005595445 python3[72184]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:37:33 np0005595445 python3[72279]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 04:37:35 np0005595445 python3[72307]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 04:37:35 np0005595445 python3[72334]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:37:35 np0005595445 kernel: loop: module loaded
Jan 26 04:37:35 np0005595445 kernel: loop3: detected capacity change from 0 to 41943040
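The task above builds a 20 GiB backing file with `dd if=/dev/zero bs=1 count=0 seek=20G` (which writes no data, producing a sparse file) and attaches it to /dev/loop3; the kernel line "capacity change from 0 to 41943040" confirms the size, since 20 GiB is 41943040 512-byte sectors. The sparse-file step can be sketched as follows, using a hypothetical /tmp path; the `losetup` attach itself needs root and a free loop device, so it is only shown as a comment:

```python
import os

# Hypothetical demo path; the playbook uses /var/lib/ceph-osd-0.img.
path = "/tmp/ceph-osd-demo.img"

# truncate() to 20 GiB has the same effect as dd count=0 seek=20G:
# the file's apparent size grows, but no blocks are written.
with open(path, "wb") as f:
    f.truncate(20 * 1024**3)

print(os.path.getsize(path))        # apparent size: 21474836480 bytes
print(os.path.getsize(path) // 512) # 41943040 sectors, matching the kernel log
# st_blocks is typically 0 here on filesystems with sparse-file support.
print(os.stat(path).st_blocks)

# Attaching requires root (done by the playbook, not here):
#   losetup /dev/loop3 /tmp/ceph-osd-demo.img

os.remove(path)
```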
Jan 26 04:37:35 np0005595445 python3[72369]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:37:36 np0005595445 lvm[72372]: PV /dev/loop3 not used.
Jan 26 04:37:36 np0005595445 lvm[72374]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:37:36 np0005595445 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 26 04:37:36 np0005595445 lvm[72382]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 26 04:37:36 np0005595445 lvm[72384]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:37:36 np0005595445 lvm[72384]: VG ceph_vg0 finished
Jan 26 04:37:36 np0005595445 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 26 04:37:36 np0005595445 python3[72462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 04:37:37 np0005595445 python3[72535]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769420256.3965585-36916-47310311552383/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:37:37 np0005595445 python3[72585]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:37:37 np0005595445 systemd[1]: Reloading.
Jan 26 04:37:37 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:37:37 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:37:38 np0005595445 systemd[1]: Starting Ceph OSD losetup...
Jan 26 04:37:38 np0005595445 bash[72624]: /dev/loop3: [64513]:4328448 (/var/lib/ceph-osd-0.img)
Jan 26 04:37:38 np0005595445 systemd[1]: Finished Ceph OSD losetup.
Jan 26 04:37:38 np0005595445 lvm[72625]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:37:38 np0005595445 lvm[72625]: VG ceph_vg0 finished
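The unit written at 04:37:37 (rendered from `ceph-osd-losetup.service.j2`) exists so the loop device is re-attached on every boot before the OSD needs it; the journal shows it run once as a oneshot ("Starting Ceph OSD losetup..." / "Finished Ceph OSD losetup.", with `bash` printing the `losetup -j`-style association `/dev/loop3: [64513]:4328448 (/var/lib/ceph-osd-0.img)`). The rendered unit itself is not logged; a plausible reconstruction, hedged as an assumption rather than the actual template output, is:

```ini
# /etc/systemd/system/ceph-osd-losetup-0.service
# Reconstruction based on the journal messages above; the exact
# ExecStart/ExecStop commands and ordering directives are assumptions.
[Unit]
Description=Ceph OSD losetup
After=local-fs.target

[Service]
Type=oneshot
RemainAfterExit=yes
# Attach the backing file, or report the existing association if already attached.
ExecStart=/bin/bash -c '/sbin/losetup /dev/loop3 /var/lib/ceph-osd-0.img || /sbin/losetup -j /var/lib/ceph-osd-0.img'
ExecStop=/sbin/losetup -d /dev/loop3

[Install]
WantedBy=multi-user.target
```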
Jan 26 04:37:39 np0005595445 chronyd[58653]: Selected source 167.160.187.12 (pool.ntp.org)
Jan 26 04:37:40 np0005595445 python3[72650]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:39:13 np0005595445 systemd[1]: Created slice User Slice of UID 42477.
Jan 26 04:39:13 np0005595445 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 26 04:39:13 np0005595445 systemd-logind[783]: New session 20 of user ceph-admin.
Jan 26 04:39:13 np0005595445 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 26 04:39:13 np0005595445 systemd[1]: Starting User Manager for UID 42477...
Jan 26 04:39:13 np0005595445 systemd[72713]: Queued start job for default target Main User Target.
Jan 26 04:39:13 np0005595445 systemd[72713]: Created slice User Application Slice.
Jan 26 04:39:13 np0005595445 systemd[72713]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 04:39:13 np0005595445 systemd[72713]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 04:39:13 np0005595445 systemd[72713]: Reached target Paths.
Jan 26 04:39:13 np0005595445 systemd[72713]: Reached target Timers.
Jan 26 04:39:13 np0005595445 systemd[72713]: Starting D-Bus User Message Bus Socket...
Jan 26 04:39:13 np0005595445 systemd[72713]: Starting Create User's Volatile Files and Directories...
Jan 26 04:39:13 np0005595445 systemd[72713]: Finished Create User's Volatile Files and Directories.
Jan 26 04:39:13 np0005595445 systemd[72713]: Listening on D-Bus User Message Bus Socket.
Jan 26 04:39:13 np0005595445 systemd[72713]: Reached target Sockets.
Jan 26 04:39:13 np0005595445 systemd[72713]: Reached target Basic System.
Jan 26 04:39:13 np0005595445 systemd[72713]: Reached target Main User Target.
Jan 26 04:39:13 np0005595445 systemd[72713]: Startup finished in 116ms.
Jan 26 04:39:13 np0005595445 systemd[1]: Started User Manager for UID 42477.
Jan 26 04:39:13 np0005595445 systemd[1]: Started Session 20 of User ceph-admin.
Jan 26 04:39:13 np0005595445 systemd-logind[783]: New session 22 of user ceph-admin.
Jan 26 04:39:13 np0005595445 systemd[1]: Started Session 22 of User ceph-admin.
Jan 26 04:39:13 np0005595445 systemd-logind[783]: New session 23 of user ceph-admin.
Jan 26 04:39:13 np0005595445 systemd[1]: Started Session 23 of User ceph-admin.
Jan 26 04:39:14 np0005595445 systemd-logind[783]: New session 24 of user ceph-admin.
Jan 26 04:39:14 np0005595445 systemd[1]: Started Session 24 of User ceph-admin.
Jan 26 04:39:14 np0005595445 systemd-logind[783]: New session 25 of user ceph-admin.
Jan 26 04:39:14 np0005595445 systemd[1]: Started Session 25 of User ceph-admin.
Jan 26 04:39:14 np0005595445 systemd-logind[783]: New session 26 of user ceph-admin.
Jan 26 04:39:14 np0005595445 systemd[1]: Started Session 26 of User ceph-admin.
Jan 26 04:39:15 np0005595445 systemd-logind[783]: New session 27 of user ceph-admin.
Jan 26 04:39:15 np0005595445 systemd[1]: Started Session 27 of User ceph-admin.
Jan 26 04:39:15 np0005595445 systemd-logind[783]: New session 28 of user ceph-admin.
Jan 26 04:39:15 np0005595445 systemd[1]: Started Session 28 of User ceph-admin.
Jan 26 04:39:15 np0005595445 systemd-logind[783]: New session 29 of user ceph-admin.
Jan 26 04:39:15 np0005595445 systemd[1]: Started Session 29 of User ceph-admin.
Jan 26 04:39:16 np0005595445 systemd-logind[783]: New session 30 of user ceph-admin.
Jan 26 04:39:16 np0005595445 systemd[1]: Started Session 30 of User ceph-admin.
Jan 26 04:39:17 np0005595445 systemd-logind[783]: New session 31 of user ceph-admin.
Jan 26 04:39:17 np0005595445 systemd[1]: Started Session 31 of User ceph-admin.
Jan 26 04:39:17 np0005595445 systemd-logind[783]: New session 32 of user ceph-admin.
Jan 26 04:39:17 np0005595445 systemd[1]: Started Session 32 of User ceph-admin.
Jan 26 04:39:18 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:18 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:19 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:19 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:19 np0005595445 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73285 (sysctl)
Jan 26 04:39:19 np0005595445 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 04:39:19 np0005595445 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 04:39:20 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:20 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:39:23 np0005595445 systemd[1]: var-lib-containers-storage-overlay-compat2388042479-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.180490209 +0000 UTC m=+40.530093498 container create eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2832144697-merged.mount: Deactivated successfully.
Jan 26 04:40:01 np0005595445 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 04:40:01 np0005595445 systemd[1]: Started libpod-conmon-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope.
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.164690848 +0000 UTC m=+40.514294167 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:01 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.293158237 +0000 UTC m=+40.642761546 container init eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.303372259 +0000 UTC m=+40.652975548 container start eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.307416565 +0000 UTC m=+40.657019854 container attach eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:01 np0005595445 focused_moser[73529]: 167 167
Jan 26 04:40:01 np0005595445 systemd[1]: libpod-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope: Deactivated successfully.
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.313041415 +0000 UTC m=+40.662644724 container died eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay-570d2b2daafeebf8abf9f02a6732ab3a7c077ce9ae325ec0f26b735608f32535-merged.mount: Deactivated successfully.
Jan 26 04:40:01 np0005595445 podman[73459]: 2026-01-26 09:40:01.505525124 +0000 UTC m=+40.855128413 container remove eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_moser, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 04:40:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:01 np0005595445 systemd[1]: libpod-conmon-eb7e0a79194f723f10e215350402c36bea81ec5855a2e789879af3dc7ea0b036.scope: Deactivated successfully.
Jan 26 04:40:01 np0005595445 podman[73555]: 2026-01-26 09:40:01.696447037 +0000 UTC m=+0.051121592 container create 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 26 04:40:01 np0005595445 systemd[1]: Started libpod-conmon-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope.
Jan 26 04:40:01 np0005595445 podman[73555]: 2026-01-26 09:40:01.6723856 +0000 UTC m=+0.027060185 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:01 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:01 np0005595445 podman[73555]: 2026-01-26 09:40:01.922872266 +0000 UTC m=+0.277546881 container init 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 26 04:40:01 np0005595445 podman[73555]: 2026-01-26 09:40:01.930361059 +0000 UTC m=+0.285035614 container start 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 04:40:01 np0005595445 podman[73555]: 2026-01-26 09:40:01.956766694 +0000 UTC m=+0.311441259 container attach 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]: [
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:    {
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "available": false,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "being_replaced": false,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "ceph_device_lvm": false,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "lsm_data": {},
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "lvs": [],
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "path": "/dev/sr0",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "rejected_reasons": [
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "Has a FileSystem",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "Insufficient space (<5GB)"
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        ],
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        "sys_api": {
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "actuators": null,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "device_nodes": [
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:                "sr0"
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            ],
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "devname": "sr0",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "human_readable_size": "482.00 KB",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "id_bus": "ata",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "model": "QEMU DVD-ROM",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "nr_requests": "2",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "parent": "/dev/sr0",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "partitions": {},
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "path": "/dev/sr0",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "removable": "1",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "rev": "2.5+",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "ro": "0",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "rotational": "1",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "sas_address": "",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "sas_device_handle": "",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "scheduler_mode": "mq-deadline",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "sectors": 0,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "sectorsize": "2048",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "size": 493568.0,
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "support_discard": "2048",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "type": "disk",
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:            "vendor": "QEMU"
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:        }
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]:    }
Jan 26 04:40:02 np0005595445 upbeat_wilbur[73572]: ]
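The JSON emitted by the `upbeat_wilbur` container is a ceph-volume inventory report: the only device listed, /dev/sr0 (the QEMU DVD-ROM), is marked `"available": false` because it carries a filesystem and is under 5 GB, so no OSD can be placed on it. Filtering such a report for usable versus rejected devices takes only a few lines; the record below is abbreviated from the log output above, and the field names follow what the container printed:

```python
import json

# Abbreviated ceph-volume inventory record, copied from the log above.
inventory = json.loads("""
[
  {
    "available": false,
    "path": "/dev/sr0",
    "rejected_reasons": ["Has a FileSystem", "Insufficient space (<5GB)"],
    "sys_api": {"size": 493568.0, "type": "disk", "vendor": "QEMU"}
  }
]
""")

usable = [d["path"] for d in inventory if d["available"]]
rejected = {d["path"]: d["rejected_reasons"] for d in inventory if not d["available"]}

print(usable)    # []
print(rejected)  # {'/dev/sr0': ['Has a FileSystem', 'Insufficient space (<5GB)']}
```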
Jan 26 04:40:02 np0005595445 systemd[1]: libpod-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope: Deactivated successfully.
Jan 26 04:40:02 np0005595445 conmon[73572]: conmon 28766eaf7fcc0b007871 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope/container/memory.events
Jan 26 04:40:02 np0005595445 podman[73555]: 2026-01-26 09:40:02.750295811 +0000 UTC m=+1.104970366 container died 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:02 np0005595445 systemd[1]: var-lib-containers-storage-overlay-01fd24e34044de2a6ca969d85672345137a3be638b2b6f9aa87d61af3cffd695-merged.mount: Deactivated successfully.
Jan 26 04:40:03 np0005595445 podman[73555]: 2026-01-26 09:40:03.052948566 +0000 UTC m=+1.407623121 container remove 28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_wilbur, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:03 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:03 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:03 np0005595445 systemd[1]: libpod-conmon-28766eaf7fcc0b0078715f46e702e8033efafd4b95ab97f292728459f1cbbc56.scope: Deactivated successfully.
Jan 26 04:40:05 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:05 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.699775493 +0000 UTC m=+0.035620739 container create d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:40:05 np0005595445 systemd[1]: Started libpod-conmon-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope.
Jan 26 04:40:05 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.769267508 +0000 UTC m=+0.105112784 container init d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.776166365 +0000 UTC m=+0.112011611 container start d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 04:40:05 np0005595445 confident_hermann[75453]: 167 167
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.779981873 +0000 UTC m=+0.115827139 container attach d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:40:05 np0005595445 systemd[1]: libpod-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope: Deactivated successfully.
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.780769316 +0000 UTC m=+0.116614582 container died d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.685651089 +0000 UTC m=+0.021496335 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:05 np0005595445 podman[75436]: 2026-01-26 09:40:05.814012586 +0000 UTC m=+0.149857842 container remove d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 26 04:40:05 np0005595445 systemd[1]: libpod-conmon-d6d787e469710293cdc608f1a0c05c6642a774a506f4548c29a13a0c4cf05ace.scope: Deactivated successfully.
Jan 26 04:40:05 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:05 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:05 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:06 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:06 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:06 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:06 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:06 np0005595445 systemd[1]: Reached target All Ceph clusters and services.
Jan 26 04:40:06 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:06 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:06 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:06 np0005595445 systemd[1]: Reached target Ceph cluster 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:40:06 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:06 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:06 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:06 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:07 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:07 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:07 np0005595445 systemd[1]: Created slice Slice /system/ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:40:07 np0005595445 systemd[1]: Reached target System Time Set.
Jan 26 04:40:07 np0005595445 systemd[1]: Reached target System Time Synchronized.
Jan 26 04:40:07 np0005595445 systemd[1]: Starting Ceph crash.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:40:07 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:07 np0005595445 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 04:40:07 np0005595445 podman[75706]: 2026-01-26 09:40:07.471240494 +0000 UTC m=+0.052963234 container create 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:07 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:07 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:07 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1865e4ebf94f5b6e666b769306a130094ae8b4bb66a9e0288e50127b753ab7b/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:07 np0005595445 podman[75706]: 2026-01-26 09:40:07.441019941 +0000 UTC m=+0.022742701 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:07 np0005595445 podman[75706]: 2026-01-26 09:40:07.541934993 +0000 UTC m=+0.123657753 container init 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 26 04:40:07 np0005595445 podman[75706]: 2026-01-26 09:40:07.548496902 +0000 UTC m=+0.130219642 container start 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:07 np0005595445 bash[75706]: 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40
Jan 26 04:40:07 np0005595445 systemd[1]: Started Ceph crash.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.707+0000 7f4fd062c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.707+0000 7f4fd062c640 -1 AuthRegistry(0x7f4fc80698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.708+0000 7f4fd062c640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.708+0000 7f4fd062c640 -1 AuthRegistry(0x7f4fd062aff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.711+0000 7f4fce3a1640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: 2026-01-26T09:40:07.711+0000 7f4fd062c640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 26 04:40:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1[75722]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.613061471 +0000 UTC m=+0.045089579 container create e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 04:40:08 np0005595445 systemd[1]: Started libpod-conmon-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope.
Jan 26 04:40:08 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.678435888 +0000 UTC m=+0.110464026 container init e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.685553032 +0000 UTC m=+0.117581140 container start e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.592554954 +0000 UTC m=+0.024583112 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.689522105 +0000 UTC m=+0.121550213 container attach e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:08 np0005595445 practical_johnson[75848]: 167 167
Jan 26 04:40:08 np0005595445 systemd[1]: libpod-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope: Deactivated successfully.
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.690874143 +0000 UTC m=+0.122902261 container died e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:08 np0005595445 systemd[1]: var-lib-containers-storage-overlay-e4a740afea6bd451727603c5e72b1d9fd896816c7a0fd62f19a0295c57db0f6a-merged.mount: Deactivated successfully.
Jan 26 04:40:08 np0005595445 podman[75832]: 2026-01-26 09:40:08.724410331 +0000 UTC m=+0.156438439 container remove e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=practical_johnson, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 26 04:40:08 np0005595445 systemd[1]: libpod-conmon-e0d6b2bed70b5147af94a114cafbae77823430339c2ec1a07afc187f657b2757.scope: Deactivated successfully.
Jan 26 04:40:08 np0005595445 podman[75874]: 2026-01-26 09:40:08.877821393 +0000 UTC m=+0.046131208 container create 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True)
Jan 26 04:40:08 np0005595445 systemd[1]: Started libpod-conmon-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope.
Jan 26 04:40:08 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:08 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:08 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:08 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:08 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:08 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:08 np0005595445 podman[75874]: 2026-01-26 09:40:08.858171682 +0000 UTC m=+0.026481517 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:08 np0005595445 podman[75874]: 2026-01-26 09:40:08.959858436 +0000 UTC m=+0.128168261 container init 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:08 np0005595445 podman[75874]: 2026-01-26 09:40:08.970799419 +0000 UTC m=+0.139109234 container start 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 26 04:40:08 np0005595445 podman[75874]: 2026-01-26 09:40:08.974432603 +0000 UTC m=+0.142742418 container attach 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: --> passed data devices: 0 physical, 1 LVM
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 8ed5e8fe-4547-4eab-be95-05fd5f9f3f95
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:09 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Jan 26 04:40:09 np0005595445 lvm[75953]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:40:09 np0005595445 lvm[75953]: VG ceph_vg0 finished
Jan 26 04:40:10 np0005595445 heuristic_swanson[75890]: stderr: got monmap epoch 1
Jan 26 04:40:10 np0005595445 heuristic_swanson[75890]: --> Creating keyring file for osd.1
Jan 26 04:40:10 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Jan 26 04:40:10 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Jan 26 04:40:10 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 8ed5e8fe-4547-4eab-be95-05fd5f9f3f95 --setuser ceph --setgroup ceph
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: stderr: 2026-01-26T09:40:10.530+0000 7ffb5d790740 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) No valid bdev label found
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: stderr: 2026-01-26T09:40:10.791+0000 7ffb5d790740 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 04:40:13 np0005595445 heuristic_swanson[75890]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 26 04:40:13 np0005595445 systemd[1]: libpod-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Deactivated successfully.
Jan 26 04:40:13 np0005595445 systemd[1]: libpod-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Consumed 2.097s CPU time.
Jan 26 04:40:13 np0005595445 podman[75874]: 2026-01-26 09:40:13.508718545 +0000 UTC m=+4.677028360 container died 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default)
Jan 26 04:40:13 np0005595445 systemd[1]: var-lib-containers-storage-overlay-144e48bf0894d5fd88832d93b6b5fb32ee1de11ba8269e368d62938cc3842a98-merged.mount: Deactivated successfully.
Jan 26 04:40:13 np0005595445 podman[75874]: 2026-01-26 09:40:13.557619942 +0000 UTC m=+4.725929757 container remove 387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_swanson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 04:40:13 np0005595445 systemd[1]: libpod-conmon-387fe622ba0789b52550a7f5fe9f0ef5dd6a85cb3ee600d6c2f1bd86ac01db23.scope: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.204546322 +0000 UTC m=+0.041633931 container create 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:14 np0005595445 systemd[1]: Started libpod-conmon-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope.
Jan 26 04:40:14 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.186080254 +0000 UTC m=+0.023167893 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.295808578 +0000 UTC m=+0.132896197 container init 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.305857945 +0000 UTC m=+0.142945544 container start 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.309107108 +0000 UTC m=+0.146194737 container attach 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 26 04:40:14 np0005595445 nice_wilson[76987]: 167 167
Jan 26 04:40:14 np0005595445 systemd[1]: libpod-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.31197579 +0000 UTC m=+0.149063389 container died 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 04:40:14 np0005595445 systemd[1]: var-lib-containers-storage-overlay-2d001f63c1c224f381f70e1f0a04c610b99716937520b8ee5f518b533cee79c5-merged.mount: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[76971]: 2026-01-26 09:40:14.345204359 +0000 UTC m=+0.182291958 container remove 5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nice_wilson, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:14 np0005595445 systemd[1]: libpod-conmon-5ea842917222bb1e3fe2ce292e31c4864b2fb48be6833e5fdc8098a877367f98.scope: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.534912669 +0000 UTC m=+0.050059661 container create 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 04:40:14 np0005595445 systemd[1]: Started libpod-conmon-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope.
Jan 26 04:40:14 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:14 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:14 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:14 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:14 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.606490923 +0000 UTC m=+0.121637935 container init 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.514472084 +0000 UTC m=+0.029619136 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.614313986 +0000 UTC m=+0.129460978 container start 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.620292357 +0000 UTC m=+0.135439499 container attach 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]: {
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:    "1": [
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:        {
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "devices": [
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "/dev/loop3"
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            ],
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "lv_name": "ceph_lv0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "lv_size": "21470642176",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1a70b85d-e3fd-5814-8a6a-37ea00fcae30,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=8ed5e8fe-4547-4eab-be95-05fd5f9f3f95,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "lv_uuid": "PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "name": "ceph_lv0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "tags": {
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.block_uuid": "PaSr8h-YOLM-efKX-5RlJ-3Pqs-GEP9-6iJAZw",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.cephx_lockbox_secret": "",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.cluster_fsid": "1a70b85d-e3fd-5814-8a6a-37ea00fcae30",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.cluster_name": "ceph",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.crush_device_class": "",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.encrypted": "0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.osd_fsid": "8ed5e8fe-4547-4eab-be95-05fd5f9f3f95",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.osd_id": "1",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.type": "block",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.vdo": "0",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:                "ceph.with_tpm": "0"
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            },
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "type": "block",
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:            "vg_name": "ceph_vg0"
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:        }
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]:    ]
Jan 26 04:40:14 np0005595445 infallible_sanderson[77029]: }
Jan 26 04:40:14 np0005595445 systemd[1]: libpod-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.932815155 +0000 UTC m=+0.447962147 container died 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 04:40:14 np0005595445 systemd[1]: var-lib-containers-storage-overlay-2f713212828322b28c4ee7d8dcc6948ead68b259fcac0e41b7ae9f36997ffd63-merged.mount: Deactivated successfully.
Jan 26 04:40:14 np0005595445 podman[77012]: 2026-01-26 09:40:14.975794992 +0000 UTC m=+0.490941974 container remove 45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 04:40:14 np0005595445 systemd[1]: libpod-conmon-45ac69645ecb663e0f1e522095327c43adcaa3e74c1157414913537740df98df.scope: Deactivated successfully.
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.547318088 +0000 UTC m=+0.050369350 container create 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:40:15 np0005595445 systemd[1]: Started libpod-conmon-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope.
Jan 26 04:40:15 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.529662314 +0000 UTC m=+0.032713596 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.626905441 +0000 UTC m=+0.129956743 container init 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.635324631 +0000 UTC m=+0.138375893 container start 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.638836982 +0000 UTC m=+0.141888244 container attach 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:15 np0005595445 modest_bhabha[77159]: 167 167
Jan 26 04:40:15 np0005595445 systemd[1]: libpod-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope: Deactivated successfully.
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.644003659 +0000 UTC m=+0.147054941 container died 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:15 np0005595445 systemd[1]: var-lib-containers-storage-overlay-2fdcb90caf1d2ff9950e0425d3100e8d83b71cbb46052cad049b7a4572a60abd-merged.mount: Deactivated successfully.
Jan 26 04:40:15 np0005595445 podman[77142]: 2026-01-26 09:40:15.676304432 +0000 UTC m=+0.179355694 container remove 2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_bhabha, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:40:15 np0005595445 systemd[1]: libpod-conmon-2a01c745f4471b87d52aae05b93a441dd06e70e5b0bbe3f0ad9984f6e5884df1.scope: Deactivated successfully.
Jan 26 04:40:15 np0005595445 podman[77190]: 2026-01-26 09:40:15.922810444 +0000 UTC m=+0.044631776 container create c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:15 np0005595445 systemd[1]: Started libpod-conmon-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope.
Jan 26 04:40:15 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:15 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:15 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:15 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:16 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:16 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:15.903201713 +0000 UTC m=+0.025023065 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:16.011869408 +0000 UTC m=+0.133690760 container init c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:16.018387374 +0000 UTC m=+0.140208706 container start c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:16.021577435 +0000 UTC m=+0.143398767 container attach c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 26 04:40:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 26 04:40:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]:                            [--no-systemd] [--no-tmpfs]
Jan 26 04:40:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test[77206]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 26 04:40:16 np0005595445 systemd[1]: libpod-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope: Deactivated successfully.
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:16.230679338 +0000 UTC m=+0.352500670 container died c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:16 np0005595445 systemd[1]: var-lib-containers-storage-overlay-a8d364819c7b47ea952b3f1eed120910f7cb7378b0367df5916e11fbe3963c3f-merged.mount: Deactivated successfully.
Jan 26 04:40:16 np0005595445 podman[77190]: 2026-01-26 09:40:16.271929287 +0000 UTC m=+0.393750619 container remove c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True)
Jan 26 04:40:16 np0005595445 systemd[1]: libpod-conmon-c54918519b985f75d59614ad7b4bd790b1e0c32713154a8bac75d4e489d274ff.scope: Deactivated successfully.
Jan 26 04:40:16 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:16 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:16 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:16 np0005595445 systemd[1]: Reloading.
Jan 26 04:40:16 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:40:16 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:40:17 np0005595445 systemd[1]: Starting Ceph osd.1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:40:17 np0005595445 podman[77367]: 2026-01-26 09:40:17.300786536 +0000 UTC m=+0.042329600 container create 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:17 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:17 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:17 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:17 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:17 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:17 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:17 np0005595445 podman[77367]: 2026-01-26 09:40:17.27958834 +0000 UTC m=+0.021131424 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:17 np0005595445 podman[77367]: 2026-01-26 09:40:17.389245472 +0000 UTC m=+0.130788556 container init 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 04:40:17 np0005595445 podman[77367]: 2026-01-26 09:40:17.394960856 +0000 UTC m=+0.136503920 container start 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 26 04:40:17 np0005595445 podman[77367]: 2026-01-26 09:40:17.398555949 +0000 UTC m=+0.140099013 container attach 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:17 np0005595445 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:17 np0005595445 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:18 np0005595445 lvm[77463]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:40:18 np0005595445 lvm[77463]: VG ceph_vg0 finished
Jan 26 04:40:18 np0005595445 lvm[77467]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:40:18 np0005595445 lvm[77467]: VG ceph_vg0 finished
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:18 np0005595445 bash[77367]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:18 np0005595445 bash[77367]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Jan 26 04:40:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate[77382]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 04:40:18 np0005595445 bash[77367]: --> ceph-volume lvm activate successful for osd ID: 1
Jan 26 04:40:18 np0005595445 systemd[1]: libpod-0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593.scope: Deactivated successfully.
Jan 26 04:40:18 np0005595445 systemd[1]: libpod-0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593.scope: Consumed 1.480s CPU time.
Jan 26 04:40:18 np0005595445 podman[77367]: 2026-01-26 09:40:18.680283311 +0000 UTC m=+1.421826385 container died 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:18 np0005595445 systemd[1]: var-lib-containers-storage-overlay-27b495c530188627c4776c9822828e7d0693794e2ce4752a123cbdcadc59ecd2-merged.mount: Deactivated successfully.
Jan 26 04:40:18 np0005595445 podman[77367]: 2026-01-26 09:40:18.726205083 +0000 UTC m=+1.467748147 container remove 0ac1e5649ab9133aa599199960d1ac5c39c443429ae70878c69dac54c116a593 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:18 np0005595445 podman[77613]: 2026-01-26 09:40:18.93825711 +0000 UTC m=+0.041929499 container create ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:18 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:18 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:18 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:18 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:18 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171b640df2369381bfad10f568be89f728f5dc5625b3469beef0a366f0bb7948/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:19 np0005595445 podman[77613]: 2026-01-26 09:40:18.999628443 +0000 UTC m=+0.103300862 container init ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 26 04:40:19 np0005595445 podman[77613]: 2026-01-26 09:40:19.005660295 +0000 UTC m=+0.109332694 container start ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 26 04:40:19 np0005595445 bash[77613]: ba4e5e4834ef65c29e4703b2ab0f0e5713b41e16b2137b2982ada209763b448d
Jan 26 04:40:19 np0005595445 podman[77613]: 2026-01-26 09:40:18.921590904 +0000 UTC m=+0.025263333 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:19 np0005595445 systemd[1]: Started Ceph osd.1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: pidfile_write: ignore empty --pid-file
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1c00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:19 np0005595445 podman[77742]: 2026-01-26 09:40:19.639742698 +0000 UTC m=+0.022492533 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:19 np0005595445 ceph-osd[77632]: bdev(0x55d16e9a1800 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.051102179 +0000 UTC m=+0.433851994 container create 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:40:20 np0005595445 systemd[1]: Started libpod-conmon-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope.
Jan 26 04:40:20 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.149593862 +0000 UTC m=+0.532343687 container init 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.160911075 +0000 UTC m=+0.543660890 container start 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.165194837 +0000 UTC m=+0.547944672 container attach 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:20 np0005595445 pensive_benz[77763]: 167 167
Jan 26 04:40:20 np0005595445 systemd[1]: libpod-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope: Deactivated successfully.
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.169754348 +0000 UTC m=+0.552504193 container died 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Jan 26 04:40:20 np0005595445 systemd[1]: var-lib-containers-storage-overlay-412fa71ff8050b5792a2e40fb88ad19918a3b60985c8c2b5730d6f8b88b3360b-merged.mount: Deactivated successfully.
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: load: jerasure load: lrc 
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:20 np0005595445 podman[77742]: 2026-01-26 09:40:20.218718607 +0000 UTC m=+0.601468422 container remove 2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_benz, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:20 np0005595445 systemd[1]: libpod-conmon-2833edf786340ea0d93d286a376f4d5550802d5be01b3a14d969f6af894ac500.scope: Deactivated successfully.
Jan 26 04:40:20 np0005595445 podman[77791]: 2026-01-26 09:40:20.399576353 +0000 UTC m=+0.060000805 container create 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:40:20 np0005595445 systemd[1]: Started libpod-conmon-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope.
Jan 26 04:40:20 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:20 np0005595445 podman[77791]: 2026-01-26 09:40:20.379804988 +0000 UTC m=+0.040229440 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:20 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:20 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:20 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:20 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:20 np0005595445 podman[77791]: 2026-01-26 09:40:20.495939395 +0000 UTC m=+0.156363847 container init 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:20 np0005595445 podman[77791]: 2026-01-26 09:40:20.502921115 +0000 UTC m=+0.163345547 container start 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Jan 26 04:40:20 np0005595445 podman[77791]: 2026-01-26 09:40:20.506363443 +0000 UTC m=+0.166787895 container attach 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:20 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:21 np0005595445 lvm[77897]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:40:21 np0005595445 lvm[77897]: VG ceph_vg0 finished
Jan 26 04:40:21 np0005595445 ecstatic_lamport[77807]: {}
Jan 26 04:40:21 np0005595445 systemd[1]: libpod-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Deactivated successfully.
Jan 26 04:40:21 np0005595445 podman[77791]: 2026-01-26 09:40:21.247672369 +0000 UTC m=+0.908096821 container died 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:40:21 np0005595445 systemd[1]: libpod-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Consumed 1.220s CPU time.
Jan 26 04:40:21 np0005595445 systemd[1]: var-lib-containers-storage-overlay-6abc23e562a823c5ab5fd0e92fc563387bc662de40b5132122a694a67974f4c5-merged.mount: Deactivated successfully.
Jan 26 04:40:21 np0005595445 podman[77791]: 2026-01-26 09:40:21.289556755 +0000 UTC m=+0.949981187 container remove 850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_lamport, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:40:21 np0005595445 systemd[1]: libpod-conmon-850cccbb0d31b69b90968858daecf9dc51ce737d873ebc97405868154a952b7b.scope: Deactivated successfully.
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83cc00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount shared_bdev_used = 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: RocksDB version: 7.9.2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Git sha 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DB SUMMARY
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DB Session ID:  GDOX11IZ247TJWEOACQM
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: CURRENT file:  CURRENT
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.error_if_exists: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.create_if_missing: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                     Options.env: 0x55d16f80ddc0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                Options.info_log: 0x55d16f8117a0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.statistics: (nil)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.use_fsync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.db_log_dir: 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.write_buffer_manager: 0x55d16f908a00
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.unordered_write: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.row_cache: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.wal_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.two_write_queues: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.wal_compression: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.atomic_flush: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_background_jobs: 4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_background_compactions: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_subcompactions: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.max_open_files: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Compression algorithms supported:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kZSTD supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kXpressCompression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kBZip2Compression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kLZ4Compression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kZlibCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kLZ4HCCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: #011kSnappyCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea369b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea369b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811b80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea369b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b9759fc-282b-40dc-a20d-ecaf65ebba52
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421636109, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421636410, "job": 1, "event": "recovery_finished"}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: freelist init
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: freelist _read_cfg
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs umount
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) close
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bdev(0x55d16f83d000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluefs mount shared_bdev_used = 4718592
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: RocksDB version: 7.9.2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Git sha 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DB SUMMARY
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DB Session ID:  GDOX11IZ247TJWEOACQN
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: CURRENT file:  CURRENT
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.error_if_exists: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.create_if_missing: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                     Options.env: 0x55d16f9ac2a0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                Options.info_log: 0x55d16f811940
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.statistics: (nil)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.use_fsync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.db_log_dir: 
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                                 Options.wal_dir: db.wal
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.write_buffer_manager: 0x55d16f908a00
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.unordered_write: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.row_cache: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                              Options.wal_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.two_write_queues: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.wal_compression: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.atomic_flush: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_background_jobs: 4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_background_compactions: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_subcompactions: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.max_open_files: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Compression algorithms supported:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kZSTD supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kXpressCompression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kBZip2Compression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kLZ4Compression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kZlibCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: 	kSnappyCompression supported: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55d16ea37350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811680)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea37350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:           Options.merge_operator: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d16f811ac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55d16ea369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.write_buffer_size: 16777216
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.max_write_buffer_number: 64
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.compression: LZ4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.num_levels: 7
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0b9759fc-282b-40dc-a20d-ecaf65ebba52
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421883439, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421888981, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421892035, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421895337, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420421, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0b9759fc-282b-40dc-a20d-ecaf65ebba52", "db_session_id": "GDOX11IZ247TJWEOACQN", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420421897133, "job": 1, "event": "recovery_finished"}
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d16f9d8000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: DB pointer 0x55d16f9b8000
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Bloc
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: _get_class not permitted to load lua
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: _get_class not permitted to load sdk
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 load_pgs
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 load_pgs opened 0 pgs
Jan 26 04:40:21 np0005595445 ceph-osd[77632]: osd.1 0 log_to_monitors true
Jan 26 04:40:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:21.928+0000 7fad5994c740 -1 osd.1 0 log_to_monitors true
Jan 26 04:40:22 np0005595445 podman[78462]: 2026-01-26 09:40:22.741738877 +0000 UTC m=+0.066424739 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 26 04:40:22 np0005595445 podman[78462]: 2026-01-26 09:40:22.848239439 +0000 UTC m=+0.172925301 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 26 04:40:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 26 04:40:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 done with init, starting boot process
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 start_boot
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 26 04:40:24 np0005595445 ceph-osd[77632]: osd.1 0  bench count 12288000 bsize 4 KiB
Jan 26 04:40:24 np0005595445 podman[78684]: 2026-01-26 09:40:24.219778748 +0000 UTC m=+0.056018311 container create 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 26 04:40:24 np0005595445 systemd[1]: Started libpod-conmon-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope.
Jan 26 04:40:24 np0005595445 podman[78684]: 2026-01-26 09:40:24.188318009 +0000 UTC m=+0.024557602 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:24 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:25 np0005595445 podman[78684]: 2026-01-26 09:40:25.036058505 +0000 UTC m=+0.872298178 container init 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:40:25 np0005595445 podman[78684]: 2026-01-26 09:40:25.048770288 +0000 UTC m=+0.885009851 container start 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 26 04:40:25 np0005595445 silly_heyrovsky[78701]: 167 167
Jan 26 04:40:25 np0005595445 systemd[1]: libpod-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope: Deactivated successfully.
Jan 26 04:40:25 np0005595445 podman[78684]: 2026-01-26 09:40:25.070181959 +0000 UTC m=+0.906421542 container attach 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 04:40:25 np0005595445 podman[78684]: 2026-01-26 09:40:25.071528988 +0000 UTC m=+0.907768551 container died 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 04:40:25 np0005595445 systemd[1]: var-lib-containers-storage-overlay-bc29a694281a3ce8f96c632fb35c2eed8c3cfe5635785b528ae0a08ae271faee-merged.mount: Deactivated successfully.
Jan 26 04:40:25 np0005595445 podman[78684]: 2026-01-26 09:40:25.189141258 +0000 UTC m=+1.025380821 container remove 7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_heyrovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Jan 26 04:40:25 np0005595445 systemd[1]: libpod-conmon-7557bcb0adb3e119edb8d915ceea666e854084fbd72d79f2d97052dbbeb3e992.scope: Deactivated successfully.
Jan 26 04:40:25 np0005595445 podman[78730]: 2026-01-26 09:40:25.357144957 +0000 UTC m=+0.054641073 container create 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 04:40:25 np0005595445 systemd[1]: Started libpod-conmon-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope.
Jan 26 04:40:25 np0005595445 podman[78730]: 2026-01-26 09:40:25.326149251 +0000 UTC m=+0.023645267 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:40:25 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:40:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:40:25 np0005595445 podman[78730]: 2026-01-26 09:40:25.462499585 +0000 UTC m=+0.159995591 container init 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 26 04:40:25 np0005595445 podman[78730]: 2026-01-26 09:40:25.468478346 +0000 UTC m=+0.165974342 container start 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 04:40:25 np0005595445 podman[78730]: 2026-01-26 09:40:25.488923281 +0000 UTC m=+0.186419277 container attach 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]: [
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:    {
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "available": false,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "being_replaced": false,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "ceph_device_lvm": false,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "lsm_data": {},
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "lvs": [],
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "path": "/dev/sr0",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "rejected_reasons": [
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "Insufficient space (<5GB)",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "Has a FileSystem"
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        ],
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        "sys_api": {
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "actuators": null,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "device_nodes": [
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:                "sr0"
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            ],
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "devname": "sr0",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "human_readable_size": "482.00 KB",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "id_bus": "ata",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "model": "QEMU DVD-ROM",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "nr_requests": "2",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "parent": "/dev/sr0",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "partitions": {},
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "path": "/dev/sr0",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "removable": "1",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "rev": "2.5+",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "ro": "0",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "rotational": "1",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "sas_address": "",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "sas_device_handle": "",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "scheduler_mode": "mq-deadline",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "sectors": 0,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "sectorsize": "2048",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "size": 493568.0,
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "support_discard": "2048",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "type": "disk",
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:            "vendor": "QEMU"
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:        }
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]:    }
Jan 26 04:40:26 np0005595445 heuristic_maxwell[78747]: ]
Jan 26 04:40:26 np0005595445 systemd[1]: libpod-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope: Deactivated successfully.
Jan 26 04:40:26 np0005595445 podman[78730]: 2026-01-26 09:40:26.151857467 +0000 UTC m=+0.849353473 container died 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:40:26 np0005595445 systemd[1]: var-lib-containers-storage-overlay-4de7a816c009eca28a69e0829b257d3277588570b1ed8e2b1efb8419d536b509-merged.mount: Deactivated successfully.
Jan 26 04:40:26 np0005595445 podman[78730]: 2026-01-26 09:40:26.264170386 +0000 UTC m=+0.961666382 container remove 257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=heuristic_maxwell, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:40:26 np0005595445 systemd[1]: libpod-conmon-257ce75e425f1d24a6afb55e29044312264c56830c9593b453072cdc645cdea9.scope: Deactivated successfully.
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 27.506 iops: 7041.512 elapsed_sec: 0.426
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [WRN] : OSD bench result of 7041.512344 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 0 waiting for initial osdmap
Jan 26 04:40:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:28.344+0000 7fad560e2640 -1 osd.1 0 waiting for initial osdmap
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 check_osdmap_features require_osd_release unknown -> squid
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 set_numa_affinity not setting numa affinity
Jan 26 04:40:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-osd-1[77628]: 2026-01-26T09:40:28.367+0000 7fad50ef7640 -1 osd.1 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 26 04:40:28 np0005595445 ceph-osd[77632]: osd.1 9 state: booting -> active
Jan 26 04:40:31 np0005595445 ceph-osd[77632]: osd.1 10 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 26 04:40:31 np0005595445 ceph-osd[77632]: osd.1 10 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Jan 26 04:40:31 np0005595445 ceph-osd[77632]: osd.1 10 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 26 04:40:31 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 10 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:40:31 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 11 pg[1.0( empty local-lis/les=10/11 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [1] r=0 lpr=10 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.255551518 +0000 UTC m=+0.057986854 container create f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 26 04:41:04 np0005595445 systemd[1]: Started libpod-conmon-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope.
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.222874966 +0000 UTC m=+0.025310392 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:04 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.3380223 +0000 UTC m=+0.140457656 container init f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.344115526 +0000 UTC m=+0.146550862 container start f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.346622035 +0000 UTC m=+0.149057371 container attach f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Jan 26 04:41:04 np0005595445 interesting_morse[79890]: 167 167
Jan 26 04:41:04 np0005595445 systemd[1]: libpod-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope: Deactivated successfully.
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.350255534 +0000 UTC m=+0.152690870 container died f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True)
Jan 26 04:41:04 np0005595445 systemd[1]: var-lib-containers-storage-overlay-c4f5786995e166730047d73ab55ff7fc8f5a8e3ead4465b53d811eec2110bbae-merged.mount: Deactivated successfully.
Jan 26 04:41:04 np0005595445 podman[79874]: 2026-01-26 09:41:04.385499556 +0000 UTC m=+0.187934902 container remove f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=interesting_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 26 04:41:04 np0005595445 systemd[1]: libpod-conmon-f4dceb8e41c583fa6d4e1667a966e6de2f12d7ac363b290d4b2134f9914df149.scope: Deactivated successfully.
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.450763848 +0000 UTC m=+0.043559211 container create f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:41:04 np0005595445 systemd[1]: Started libpod-conmon-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope.
Jan 26 04:41:04 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:04 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:04 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:04 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:04 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.514683433 +0000 UTC m=+0.107478796 container init f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.520598844 +0000 UTC m=+0.113394207 container start f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True)
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.523454552 +0000 UTC m=+0.116249935 container attach f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.43254533 +0000 UTC m=+0.025340723 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:04 np0005595445 systemd[1]: libpod-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope: Deactivated successfully.
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.618897947 +0000 UTC m=+0.211693330 container died f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:41:04 np0005595445 systemd[1]: var-lib-containers-storage-overlay-826be8ffc87ec79a65fbc260f07deaaaefefd91c9203b84aeb29d9a684a5556e-merged.mount: Deactivated successfully.
Jan 26 04:41:04 np0005595445 podman[79908]: 2026-01-26 09:41:04.650225492 +0000 UTC m=+0.243020855 container remove f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_tharp, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 04:41:04 np0005595445 systemd[1]: libpod-conmon-f2a20f0bab75e1bc312172474ced89ee318792eb49262d2dd46deb1b21807bad.scope: Deactivated successfully.
Jan 26 04:41:04 np0005595445 systemd[1]: Reloading.
Jan 26 04:41:04 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:41:04 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:41:04 np0005595445 systemd[1]: Reloading.
Jan 26 04:41:05 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:41:05 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:41:05 np0005595445 systemd[1]: Starting Ceph mon.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:41:05 np0005595445 podman[80087]: 2026-01-26 09:41:05.452736381 +0000 UTC m=+0.041458703 container create 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:41:05 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:05 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:05 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:05 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f4b4a876e7c967071e40ff8fe7a84132314fd625a16463acebae90d669257e8/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:05 np0005595445 podman[80087]: 2026-01-26 09:41:05.519487252 +0000 UTC m=+0.108209594 container init 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:41:05 np0005595445 podman[80087]: 2026-01-26 09:41:05.525025553 +0000 UTC m=+0.113747875 container start 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mon-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:41:05 np0005595445 bash[80087]: 0913a1c63c0c3a30499c5537209d0784e72b8023efea53f6d17a9d4102eb8fe2
Jan 26 04:41:05 np0005595445 podman[80087]: 2026-01-26 09:41:05.433553937 +0000 UTC m=+0.022276279 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:05 np0005595445 systemd[1]: Started Ceph mon.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: pidfile_write: ignore empty --pid-file
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: load: jerasure load: lrc 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: RocksDB version: 7.9.2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Git sha 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: DB SUMMARY
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: DB Session ID:  OSRMBNXDC8EXU3R2EA69
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: CURRENT file:  CURRENT
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: IDENTITY file:  IDENTITY
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 636 ; 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                         Options.error_if_exists: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.create_if_missing: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                         Options.paranoid_checks: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                                     Options.env: 0x55af2b350c20
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                                Options.info_log: 0x55af2cafda20
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.max_file_opening_threads: 16
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                              Options.statistics: (nil)
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                               Options.use_fsync: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.max_log_file_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                         Options.allow_fallocate: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.use_direct_reads: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.create_missing_column_families: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                              Options.db_log_dir: 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                                 Options.wal_dir: 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.advise_random_on_open: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                    Options.write_buffer_manager: 0x55af2cb01900
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                            Options.rate_limiter: (nil)
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.unordered_write: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                               Options.row_cache: None
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                              Options.wal_filter: None
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.allow_ingest_behind: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.two_write_queues: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.manual_wal_flush: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.wal_compression: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.atomic_flush: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.log_readahead_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.allow_data_in_errors: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.db_host_id: __hostname__
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.max_background_jobs: 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.max_background_compactions: -1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.max_subcompactions: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.max_total_wal_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                          Options.max_open_files: -1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                          Options.bytes_per_sync: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:       Options.compaction_readahead_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.max_background_flushes: -1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Compression algorithms supported:
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kZSTD supported: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kXpressCompression supported: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kBZip2Compression supported: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kLZ4Compression supported: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kZlibCompression supported: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: 	kSnappyCompression supported: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:           Options.merge_operator: 
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:        Options.compaction_filter: None
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:        Options.compaction_filter_factory: None
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:  Options.sst_partitioner_factory: None
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55af2cafd6a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55af2cb209b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:        Options.write_buffer_size: 33554432
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:  Options.max_write_buffer_number: 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.compression: NoCompression
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:       Options.prefix_extractor: nullptr
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.num_levels: 7
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.compression_opts.level: 32767
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:               Options.compression_opts.strategy: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                  Options.compression_opts.enabled: false
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.arena_block_size: 1048576
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.disable_auto_compactions: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.inplace_update_support: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                           Options.bloom_locality: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                    Options.max_successive_merges: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.paranoid_file_checks: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.force_consistency_checks: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.report_bg_io_stats: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                               Options.ttl: 2592000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                       Options.enable_blob_files: false
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                           Options.min_blob_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                          Options.blob_file_size: 268435456
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb:                Options.blob_file_starting_level: 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4e64dd99-608f-448d-a4f8-af05bb4d42d8
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465572110, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465574602, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1773, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420465574753, "job": 1, "event": "recovery_finished"}
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55af2cb22e00
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: DB pointer 0x55af2cb32000
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.73 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.73 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 512.00 MB usage: 0.98 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.77 KB,0.000146031%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(???) e0 preinit fsid 1a70b85d-e3fd-5814-8a6a-37ea00fcae30
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012btime 2026-01-26T09:38:21:975599+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 26 04:41:05 np0005595445 ceph-mon[80107]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 26 04:41:07 np0005595445 ceph-mon[80107]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 26 04:41:07 np0005595445 ceph-mon[80107]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 26 04:41:07 np0005595445 ceph-mon[80107]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 26 04:41:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864304,os=Linux}
Jan 26 04:41:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e13 e13: 2 total, 2 up, 2 in
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: Deploying daemon mgr.compute-2.oynaeu on compute-2
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3981712437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: mon.compute-0 calling monitor election
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: mon.compute-2 calling monitor election
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: mon.compute-1 calling monitor election
Jan 26 04:41:11 np0005595445 ceph-mon[80107]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 26 04:41:11 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 13 pg[2.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.743990975 +0000 UTC m=+0.043257962 container create c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:41:11 np0005595445 systemd[1]: Started libpod-conmon-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope.
Jan 26 04:41:11 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.724481392 +0000 UTC m=+0.023748399 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.82621106 +0000 UTC m=+0.125478067 container init c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True)
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.834104934 +0000 UTC m=+0.133371921 container start c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.838221477 +0000 UTC m=+0.137488494 container attach c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 04:41:11 np0005595445 upbeat_matsumoto[80254]: 167 167
Jan 26 04:41:11 np0005595445 systemd[1]: libpod-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope: Deactivated successfully.
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.840802658 +0000 UTC m=+0.140069675 container died c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:41:11 np0005595445 systemd[1]: var-lib-containers-storage-overlay-ab8936b293513bdd83643f1cb105d585fdca2440dfb7f9630c13e584ac026faa-merged.mount: Deactivated successfully.
Jan 26 04:41:11 np0005595445 podman[80237]: 2026-01-26 09:41:11.878439455 +0000 UTC m=+0.177706442 container remove c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_matsumoto, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:41:11 np0005595445 systemd[1]: libpod-conmon-c26e552561eee9962ccbb140b8a7d3b49ec454284392c55d2ce70be6f0c3f39c.scope: Deactivated successfully.
Jan 26 04:41:11 np0005595445 systemd[1]: Reloading.
Jan 26 04:41:11 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:41:11 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:41:12 np0005595445 systemd[1]: Reloading.
Jan 26 04:41:12 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:41:12 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: overall HEALTH_OK
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3981712437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.xammti", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.xammti", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: Deploying daemon mgr.compute-1.xammti on compute-1
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3023141661' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:12 np0005595445 systemd[1]: Starting Ceph mgr.compute-1.xammti for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 26 04:41:12 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 14 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [1] r=0 lpr=13 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:12 np0005595445 podman[80397]: 2026-01-26 09:41:12.729397055 +0000 UTC m=+0.067585006 container create 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Jan 26 04:41:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/209a1586a896d1660d6a6956909ce330bbbef0c072dfc98a2bca17967f7a0a06/merged/var/lib/ceph/mgr/ceph-compute-1.xammti supports timestamps until 2038 (0x7fffffff)
Jan 26 04:41:12 np0005595445 podman[80397]: 2026-01-26 09:41:12.688936661 +0000 UTC m=+0.027124632 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:12 np0005595445 podman[80397]: 2026-01-26 09:41:12.792084206 +0000 UTC m=+0.130272187 container init 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:41:12 np0005595445 podman[80397]: 2026-01-26 09:41:12.797234557 +0000 UTC m=+0.135422508 container start 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 04:41:12 np0005595445 bash[80397]: 78ca290e02e494a62ed6e7db1247d89a717a3aab43caa9273ac85407686f6e4e
Jan 26 04:41:12 np0005595445 systemd[1]: Started Ceph mgr.compute-1.xammti for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 04:41:12 np0005595445 ceph-mgr[80416]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:41:12 np0005595445 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 04:41:12 np0005595445 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 04:41:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 26 04:41:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 04:41:13 np0005595445 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:41:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 04:41:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:13.158+0000 7f66329a4140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:41:13 np0005595445 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:41:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 04:41:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:13.246+0000 7f66329a4140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:41:13 np0005595445 ceph-mon[80107]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:13 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3023141661' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:13 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:13 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 26 04:41:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 04:41:14 np0005595445 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:41:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:14.465+0000 7f66329a4140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:41:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: Deploying daemon crash.compute-2 on compute-2
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1199163324' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:14 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e16 e16: 2 total, 2 up, 2 in
Jan 26 04:41:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e16 _set_new_cache_sizes cache_size:1019934124 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 04:41:15 np0005595445 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:41:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 04:41:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:15.739+0000 7f66329a4140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:41:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e17 e17: 2 total, 2 up, 2 in
Jan 26 04:41:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 04:41:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 04:41:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:  from numpy import show_config as show_numpy_config
Jan 26 04:41:15 np0005595445 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:41:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:15.989+0000 7f66329a4140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:41:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1199163324' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:41:16 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:41:16 np0005595445 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:41:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 04:41:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:16.080+0000 7f66329a4140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:41:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 04:41:16 np0005595445 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:41:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 04:41:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:16.570+0000 7f66329a4140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:41:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 04:41:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 04:41:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 04:41:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 04:41:18 np0005595445 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:41:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:18.045+0000 7f66329a4140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:41:18 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 04:41:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e18 e18: 2 total, 2 up, 2 in
Jan 26 04:41:18 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:18 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1746553743' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.077+0000 7f66329a4140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.175+0000 7f66329a4140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.401+0000 7f66329a4140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:19.833+0000 7f66329a4140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:41:19 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 04:41:20 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/4005713193' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]: dispatch
Jan 26 04:41:20 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]: dispatch
Jan 26 04:41:20 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1746553743' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:20 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "cdaf6859-268c-4a38-b792-ad916b17c334"}]': finished
Jan 26 04:41:20 np0005595445 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:41:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:20.230+0000 7f66329a4140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:41:20 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 04:41:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e19 _set_new_cache_sizes cache_size:1020053172 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:21 np0005595445 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:41:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:21.455+0000 7f66329a4140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:41:21 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 04:41:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 26 04:41:21 np0005595445 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:41:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:21.596+0000 7f66329a4140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:41:21 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 04:41:21 np0005595445 ceph-mon[80107]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:21 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/929823694' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:21 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 04:41:22 np0005595445 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:41:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:22.316+0000 7f66329a4140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:41:22 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 04:41:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 26 04:41:23 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/929823694' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.583+0000 7f66329a4140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 04:41:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.674+0000 7f66329a4140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 04:41:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.780+0000 7f66329a4140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 04:41:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:41:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 04:41:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:23.966+0000 7f66329a4140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:41:24 np0005595445 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:41:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:24.054+0000 7f66329a4140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:41:24 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 04:41:24 np0005595445 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:41:24 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 04:41:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:24.693+0000 7f66329a4140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 04:41:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.152+0000 7f66329a4140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 04:41:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.484+0000 7f66329a4140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:41:25.575+0000 7f66329a4140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:41:25 np0005595445 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x561413592d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Jan 26 04:41:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e23 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:26 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1985194690' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 26 04:41:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 26 04:41:27 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 24 pg[7.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:28 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1985194690' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 26 04:41:28 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 26 04:41:28 np0005595445 ceph-mon[80107]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 26 04:41:28 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 25 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [1] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:29 np0005595445 ceph-mon[80107]: Deploying daemon osd.2 on compute-2
Jan 26 04:41:29 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2556293450' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 26 04:41:29 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 26 04:41:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:30 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:30 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2556293450' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 26 04:41:32 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:32 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:32 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1132920361' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 26 04:41:32 np0005595445 ceph-mon[80107]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 26 04:41:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 26 04:41:34 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1132920361' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 26 04:41:34 np0005595445 ceph-mon[80107]: from='osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 04:41:34 np0005595445 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 26 04:41:34 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:34 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:35 np0005595445 podman[80594]: 2026-01-26 09:41:35.082228092 +0000 UTC m=+0.119316858 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:41:35 np0005595445 podman[80594]: 2026-01-26 09:41:35.224078065 +0000 UTC m=+0.261166791 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e29 e29: 3 total, 2 up, 3 in
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2154943776' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 26 04:41:35 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2154943776' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2817412888' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 26 04:41:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e30 e30: 3 total, 2 up, 3 in
Jan 26 04:41:38 np0005595445 ceph-mon[80107]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:38 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2817412888' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 26 04:41:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Jan 26 04:41:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3209549506' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: Unable to set osd_memory_target on compute-2 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 04:41:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e32 e32: 3 total, 3 up, 3 in
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: OSD bench result of 4662.749970 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3209549506' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: osd.2 [v2:192.168.122.102:6800/4046341804,v1:192.168.122.102:6801/4046341804] boot
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:42 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:41:44 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3224045909' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 26 04:41:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 26 04:41:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:45 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/3224045909' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 26 04:41:47 np0005595445 ceph-mon[80107]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 26 04:41:47 np0005595445 ceph-mon[80107]: Cluster is now healthy
Jan 26 04:41:47 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 26 04:41:48 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:48 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:48 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 26 04:41:48 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36 pruub=11.864285469s) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active pruub 98.601074219s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:41:48 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=13/14 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36 pruub=11.864285469s) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown pruub 98.601074219s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=13/14 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:49 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:49 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:49 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.0( empty local-lis/les=36/37 n=0 ec=13/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=13/13 les/c/f=14/14/0 sis=36) [1] r=0 lpr=36 pi=[13,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Jan 26 04:41:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Jan 26 04:41:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 26 04:41:50 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:50 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:50 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1d deep-scrub starts
Jan 26 04:41:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1d deep-scrub ok
Jan 26 04:41:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Jan 26 04:41:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/4014478342' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 26 04:41:52 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/4014478342' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 26 04:41:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.9 deep-scrub starts
Jan 26 04:41:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.9 deep-scrub ok
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1576342763' entity='client.admin' 
Jan 26 04:41:53 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:41:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.8 deep-scrub starts
Jan 26 04:41:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.8 deep-scrub ok
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.zllcia", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 26 04:41:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 41 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.039972305s) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 106.636978149s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.039972305s) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown pruub 106.636978149s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.d( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.7( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.b( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.10( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.12( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.14( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.16( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.17( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.19( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1d( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1e( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=24/25 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: Saving service ingress.rgw.default spec with placement count:2
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: Reconfiguring mgr.compute-0.zllcia (monmap changed)...
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: Reconfiguring daemon mgr.compute-0.zllcia on compute-0
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 04:41:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1c( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: Reconfiguring osd.0 (monmap changed)...
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: Reconfiguring daemon osd.0 on compute-0
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:56 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.12( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.17( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.15( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.0( empty local-lis/les=41/43 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.7( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.c( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.19( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 43 pg[7.1a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=24/24 les/c/f=25/25/0 sis=41) [1] r=0 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.703684593 +0000 UTC m=+0.044999409 container create b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 26 04:41:56 np0005595445 systemd[72713]: Starting Mark boot as successful...
Jan 26 04:41:56 np0005595445 systemd[72713]: Finished Mark boot as successful.
Jan 26 04:41:56 np0005595445 systemd[1]: Started libpod-conmon-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope.
Jan 26 04:41:56 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.683823281 +0000 UTC m=+0.025138117 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.785952549 +0000 UTC m=+0.127267385 container init b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.795930851 +0000 UTC m=+0.137245667 container start b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.802324066 +0000 UTC m=+0.143638902 container attach b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 26 04:41:56 np0005595445 infallible_chaplygin[81314]: 167 167
Jan 26 04:41:56 np0005595445 systemd[1]: libpod-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope: Deactivated successfully.
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.806649424 +0000 UTC m=+0.147964270 container died b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 26 04:41:56 np0005595445 systemd[1]: var-lib-containers-storage-overlay-7b09fe1a64abaaa3d1bc139993fab6f61383ba137d0e44997a3af79b23d69153-merged.mount: Deactivated successfully.
Jan 26 04:41:56 np0005595445 podman[81297]: 2026-01-26 09:41:56.850682446 +0000 UTC m=+0.191997262 container remove b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_chaplygin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Jan 26 04:41:56 np0005595445 systemd[1]: libpod-conmon-b29931f2f71444d70514aabec122e1b84092c0101c95c0d86b2c2cb4570f2332.scope: Deactivated successfully.
Jan 26 04:41:57 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Jan 26 04:41:57 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Jan 26 04:41:57 np0005595445 podman[81397]: 2026-01-26 09:41:57.932317524 +0000 UTC m=+0.062073736 container create cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Saving service node-exporter spec with placement *
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Saving service grafana spec with placement compute-0;count:1
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Saving service prometheus spec with placement compute-0;count:1
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Saving service alertmanager spec with placement compute-0;count:1
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:57 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 26 04:41:57 np0005595445 systemd[1]: Started libpod-conmon-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope.
Jan 26 04:41:57 np0005595445 podman[81397]: 2026-01-26 09:41:57.901174004 +0000 UTC m=+0.030930246 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:58 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:58 np0005595445 podman[81397]: 2026-01-26 09:41:58.025889928 +0000 UTC m=+0.155646190 container init cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid)
Jan 26 04:41:58 np0005595445 podman[81397]: 2026-01-26 09:41:58.038415054 +0000 UTC m=+0.168171256 container start cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 26 04:41:58 np0005595445 podman[81397]: 2026-01-26 09:41:58.042758224 +0000 UTC m=+0.172514496 container attach cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:41:58 np0005595445 reverent_cohen[81413]: 167 167
Jan 26 04:41:58 np0005595445 systemd[1]: libpod-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope: Deactivated successfully.
Jan 26 04:41:58 np0005595445 conmon[81413]: conmon cccf79dfe42342f2e344 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope/container/memory.events
Jan 26 04:41:58 np0005595445 podman[81397]: 2026-01-26 09:41:58.046461187 +0000 UTC m=+0.176217399 container died cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 04:41:58 np0005595445 systemd[1]: var-lib-containers-storage-overlay-6b509946277c5ffa87f095919ad2a2bff52212c4129d80b8e30f1f974f708959-merged.mount: Deactivated successfully.
Jan 26 04:41:58 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Jan 26 04:41:58 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Jan 26 04:41:58 np0005595445 podman[81397]: 2026-01-26 09:41:58.094145717 +0000 UTC m=+0.223901889 container remove cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=reverent_cohen, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 26 04:41:58 np0005595445 systemd[1]: libpod-conmon-cccf79dfe42342f2e3443718be4ba9c6062e257ccf57674f6bf35544b8066bf9.scope: Deactivated successfully.
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.914959797 +0000 UTC m=+0.036147222 container create cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: Reconfiguring osd.1 (monmap changed)...
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: Reconfiguring daemon osd.1 on compute-1
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/468906199' entity='client.admin' 
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 26 04:41:58 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/4184262969' entity='client.admin' 
Jan 26 04:41:58 np0005595445 systemd[1]: Started libpod-conmon-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope.
Jan 26 04:41:58 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.984309697 +0000 UTC m=+0.105497152 container init cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.989519291 +0000 UTC m=+0.110706716 container start cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:41:58 np0005595445 cool_curran[81517]: 167 167
Jan 26 04:41:58 np0005595445 systemd[1]: libpod-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope: Deactivated successfully.
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.898786919 +0000 UTC m=+0.019974364 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.995153037 +0000 UTC m=+0.116340482 container attach cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 26 04:41:58 np0005595445 podman[81501]: 2026-01-26 09:41:58.995681703 +0000 UTC m=+0.116869128 container died cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:41:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay-95fe52c1ffe4c834bd214a626e765ebc17c43ad3d55c91808874fba0a5fa77c4-merged.mount: Deactivated successfully.
Jan 26 04:41:59 np0005595445 podman[81501]: 2026-01-26 09:41:59.032028459 +0000 UTC m=+0.153215884 container remove cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_curran, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 04:41:59 np0005595445 systemd[1]: libpod-conmon-cd06c5cdd94519ec9b0c636ea140ff9c7b9c570d14bb140c1466a6aadb6fdef9.scope: Deactivated successfully.
Jan 26 04:41:59 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 26 04:41:59 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.0 scrub starts
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.0 scrub ok
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.18( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.1c( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1b( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1c( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.f( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.3( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.7( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.a( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.d( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.c( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.f( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.16( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.13( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.15( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.14( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.11( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[3.16( empty local-lis/les=0/0 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.10( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[5.1f( empty local-lis/les=0/0 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759461403s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621505737s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927159309s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789230347s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927136421s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789222717s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1d( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759422302s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621505737s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927088737s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789222717s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.927085876s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789230347s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759243965s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621566772s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926724434s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789070129s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.10( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759223938s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621566772s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926703453s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789070129s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926499367s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789016724s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926483154s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789016724s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759040833s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621620178s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758980751s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621597290s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.11( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.759001732s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621620178s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926407814s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789062500s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.16( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758960724s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621597290s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.926392555s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789062500s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752825737s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.615554810s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752787590s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.615554810s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925808907s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788749695s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758676529s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621643066s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925792694s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788749695s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752552032s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.615554810s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.14( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758651733s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621643066s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925723076s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788749695s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.13( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.752522469s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.615554810s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925703049s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788749695s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758640289s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621711731s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.a( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758584976s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621711731s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925417900s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788589478s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925391197s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788589478s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925354004s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788589478s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758435249s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621688843s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925333977s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788589478s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758410454s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621688843s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925474167s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788848877s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758394241s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621780396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.9( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758376122s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621780396s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758297920s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621726990s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925447464s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788848877s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758279800s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621726990s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925062180s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788543701s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925036430s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788543701s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758199692s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621826172s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.5( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758179665s) [2] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621826172s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758061409s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621742249s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924700737s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788383484s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924477577s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788177490s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924451828s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788177490s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924640656s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788383484s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.6( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758026123s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621742249s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.758007050s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621833801s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.4( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757991791s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621833801s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924235344s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788177490s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925599098s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.789573669s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.924210548s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788177490s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757957458s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621955872s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.925580025s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.789573669s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.3( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757932663s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621955872s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757944107s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621994019s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.2( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757916451s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621994019s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923816681s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788032532s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757732391s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621994019s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923790932s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788032532s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.f( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757705688s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621994019s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757695198s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622009277s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923678398s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788002014s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1e( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757678986s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622009277s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923655510s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788002014s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923947334s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788406372s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923660278s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.788131714s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923925400s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788406372s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923631668s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.788131714s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914971352s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.779518127s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757479668s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622047424s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914941788s) [2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.779518127s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.18( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757456779s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622047424s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757425308s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.622062683s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923221588s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.787895203s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.1b( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.757405281s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.622062683s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.923195839s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.787895203s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914700508s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 111.779487610s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/13 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=12.914666176s) [0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 111.779487610s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.756810188s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active pruub 110.621734619s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[7.8( empty local-lis/les=41/43 n=0 ec=41/24 lis/c=41/41 les/c/f=43/43/0 sis=44 pruub=11.756777763s) [0] r=-1 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 110.621734619s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.1a( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.18( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.15( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.13( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.d( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.c( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.e( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.3( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.5( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.7( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.8( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.1b( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.19( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.1a( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.5( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[6.a( empty local-lis/les=0/0 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 44 pg[4.a( empty local-lis/les=0/0 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/2724000136' entity='client.admin' 
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.1b( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.7( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.18( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.1a( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1c( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.1a( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.e( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.e( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.d( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.3( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.7( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.2( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.8( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.d( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.c( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.19( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.5( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.d( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.a( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.f( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.9( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.10( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.c( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.15( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.14( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.16( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.15( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[4.13( empty local-lis/les=44/45 n=0 ec=39/16 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.13( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.11( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.10( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.16( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1f( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.a( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.a( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.2( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[6.5( empty local-lis/les=44/45 n=0 ec=41/21 lis/c=41/41 les/c/f=42/42/0 sis=44) [1] r=0 lpr=44 pi=[41,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.3( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.f( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.5( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[3.1c( empty local-lis/les=44/45 n=0 ec=37/14 lis/c=37/37 les/c/f=38/38/0 sis=44) [1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.18( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 45 pg[5.1b( empty local-lis/les=44/45 n=0 ec=39/18 lis/c=39/39 les/c/f=40/40/0 sis=44) [1] r=0 lpr=44 pi=[39,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:01 np0005595445 podman[81653]: 2026-01-26 09:42:01.84591933 +0000 UTC m=+0.072795076 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:42:01 np0005595445 podman[81653]: 2026-01-26 09:42:01.972227838 +0000 UTC m=+0.199103584 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True)
Jan 26 04:42:02 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 26 04:42:02 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:02 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:42:03 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 26 04:42:03 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 26 04:42:03 np0005595445 python3[81764]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:42:03 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/308369494' entity='client.admin' 
Jan 26 04:42:04 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 26 04:42:04 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 26 04:42:04 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1533391079' entity='client.admin' 
Jan 26 04:42:05 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 26 04:42:05 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 26 04:42:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:06 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 26 04:42:06 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 26 04:42:07 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1831666463' entity='client.admin' 
Jan 26 04:42:07 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Jan 26 04:42:07 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Jan 26 04:42:08 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:08 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 26 04:42:08 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 26 04:42:09 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 26 04:42:09 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1606551457' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.fgzdbm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.fgzdbm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:09 np0005595445 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-2.fgzdbm on compute-2
Jan 26 04:42:10 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Jan 26 04:42:10 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Jan 26 04:42:10 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1606551457' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 26 04:42:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.794671323 +0000 UTC m=+0.037591151 container create 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:42:10 np0005595445 systemd[1]: Started libpod-conmon-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope.
Jan 26 04:42:10 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.860548228 +0000 UTC m=+0.103468076 container init 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.865910997 +0000 UTC m=+0.108830825 container start 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.868628442 +0000 UTC m=+0.111548290 container attach 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 26 04:42:10 np0005595445 cranky_yonath[81886]: 167 167
Jan 26 04:42:10 np0005595445 systemd[1]: libpod-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope: Deactivated successfully.
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.87039615 +0000 UTC m=+0.113315978 container died 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True)
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.778722082 +0000 UTC m=+0.021641930 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:42:10 np0005595445 systemd[1]: var-lib-containers-storage-overlay-91892cd777ceefaf9d1f2af3a86edfb96691207ba4586a4cd92f4d1bb58e0f86-merged.mount: Deactivated successfully.
Jan 26 04:42:10 np0005595445 podman[81870]: 2026-01-26 09:42:10.899717823 +0000 UTC m=+0.142637651 container remove 77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cranky_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Jan 26 04:42:10 np0005595445 systemd[1]: libpod-conmon-77b1d608585c1a0bb7217e12a78800b2a94c3442ae259ee727757c8d7d00aeb0.scope: Deactivated successfully.
Jan 26 04:42:10 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:11 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:11 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/588199508' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: from='mgr.14122 192.168.122.100:0/1352844427' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-1.fbcidm on compute-1
Jan 26 04:42:11 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Jan 26 04:42:11 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 04:42:11 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 26 04:42:11 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 04:42:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 04:42:11 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 04:42:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:11.436+0000 7f4bb702e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:11 np0005595445 systemd[1]: session-27.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-23.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-25.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 25.
Jan 26 04:42:11 np0005595445 systemd[1]: session-22.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-20.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 22.
Jan 26 04:42:11 np0005595445 systemd[1]: session-31.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-29.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-30.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 27 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 32 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 20 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd[1]: session-28.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 29 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd[1]: session-24.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-26.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 31 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd[1]: Starting Ceph rgw.rgw.compute-1.fbcidm for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 23 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 30 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 28 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 24 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Session 26 logged out. Waiting for processes to exit.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 27.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 23.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 20.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 31.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 29.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 30.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 28.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 24.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 26.
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:11 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 04:42:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:11.552+0000 7f4bb702e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:11 np0005595445 podman[82045]: 2026-01-26 09:42:11.742941053 +0000 UTC m=+0.038283771 container create 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:42:11 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:42:11 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:42:11 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:42:11 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef533ef4332b913a823ed51d28459eea8cb3456240c3cd6ba607a173abcf5e0a/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.fbcidm supports timestamps until 2038 (0x7fffffff)
Jan 26 04:42:11 np0005595445 podman[82045]: 2026-01-26 09:42:11.809327522 +0000 UTC m=+0.104670260 container init 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 04:42:11 np0005595445 podman[82045]: 2026-01-26 09:42:11.814047322 +0000 UTC m=+0.109390040 container start 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:42:11 np0005595445 bash[82045]: 4dd5c8bd1cc856abf0889650d4a4346dea8cfb7a23592a8467268a3371cecad4
Jan 26 04:42:11 np0005595445 podman[82045]: 2026-01-26 09:42:11.726645972 +0000 UTC m=+0.021988690 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:42:11 np0005595445 systemd[1]: Started Ceph rgw.rgw.compute-1.fbcidm for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:42:11 np0005595445 radosgw[82065]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:42:11 np0005595445 radosgw[82065]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 26 04:42:11 np0005595445 radosgw[82065]: framework: beast
Jan 26 04:42:11 np0005595445 radosgw[82065]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 26 04:42:11 np0005595445 radosgw[82065]: init_numa not setting numa affinity
Jan 26 04:42:11 np0005595445 systemd[1]: session-32.scope: Deactivated successfully.
Jan 26 04:42:11 np0005595445 systemd[1]: session-32.scope: Consumed 1min 37.906s CPU time.
Jan 26 04:42:11 np0005595445 systemd-logind[783]: Removed session 32.
Jan 26 04:42:12 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.2 deep-scrub starts
Jan 26 04:42:12 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 2.2 deep-scrub ok
Jan 26 04:42:12 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/588199508' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 26 04:42:12 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/2891557756' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 04:42:12 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 26 04:42:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 26 04:42:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 04:42:12 np0005595445 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 04:42:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:12.457+0000 7f4bb702e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.138+0000 7f4bb702e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 26 04:42:13 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:  from numpy import show_config as show_numpy_config
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.329+0000 7f4bb702e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 04:42:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 26 04:42:13 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.408+0000 7f4bb702e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:13.553+0000 7f4bb702e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 04:42:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 04:42:14 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 26 04:42:14 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 04:42:14 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 04:42:14 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 04:42:14 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 04:42:14 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 26 04:42:14 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:14.724+0000 7f4bb702e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 04:42:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:14.961+0000 7f4bb702e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.050+0000 7f4bb702e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 26 04:42:15 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.122+0000 7f4bb702e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.205+0000 7f4bb702e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.276+0000 7f4bb702e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 26 04:42:15 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 50 pg[10.0( empty local-lis/les=0/0 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:15 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 04:42:15 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 26 04:42:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.655+0000 7f4bb702e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 04:42:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:15.758+0000 7f4bb702e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 04:42:16 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 26 04:42:16 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 04:42:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.253+0000 7f4bb702e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 26 04:42:16 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 51 pg[10.0( empty local-lis/les=50/51 n=0 ec=50/50 lis/c=0/0 les/c/f=0/0/0 sis=50) [1] r=0 lpr=50 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 04:42:16 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 04:42:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.890+0000 7f4bb702e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 04:42:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:16.968+0000 7f4bb702e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.071+0000 7f4bb702e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 04:42:17 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 26 04:42:17 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.253+0000 7f4bb702e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.335+0000 7f4bb702e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 04:42:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 04:42:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.527+0000 7f4bb702e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:17.776+0000 7f4bb702e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:18.065+0000 7f4bb702e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 04:42:18 np0005595445 systemd-logind[783]: New session 33 of user ceph-admin.
Jan 26 04:42:18 np0005595445 systemd[1]: Started Session 33 of User ceph-admin.
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:18.144+0000 7f4bb702e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x559dd25af860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 04:42:18 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 26 04:42:18 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 26 04:42:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.102:0/1812478715' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.101:0/992292627' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 04:42:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 26 04:42:18 np0005595445 radosgw[82065]: v1 topic migration: starting v1 topic migration..
Jan 26 04:42:18 np0005595445 radosgw[82065]: LDAP not started since no server URIs were provided in the configuration.
Jan 26 04:42:18 np0005595445 radosgw[82065]: v1 topic migration: finished v1 topic migration
Jan 26 04:42:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-rgw-rgw-compute-1-fbcidm[82061]: 2026-01-26T09:42:18.781+0000 7f18dcb5b980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 26 04:42:18 np0005595445 radosgw[82065]: framework: beast
Jan 26 04:42:18 np0005595445 radosgw[82065]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 26 04:42:18 np0005595445 radosgw[82065]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 26 04:42:18 np0005595445 radosgw[82065]: starting handler: beast
Jan 26 04:42:18 np0005595445 podman[82802]: 2026-01-26 09:42:18.822111988 +0000 UTC m=+0.067907681 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 04:42:18 np0005595445 radosgw[82065]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:42:18 np0005595445 radosgw[82065]: mgrc service_daemon_register rgw.24169 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.fbcidm,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864304,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=88adcf12-6dc3-48b6-86bb-ed23fd934e78,zone_name=default,zonegroup_id=423841e2-30ae-45d1-92b7-7a24aa3d4488,zonegroup_name=default}
Jan 26 04:42:18 np0005595445 podman[82802]: 2026-01-26 09:42:18.917675954 +0000 UTC m=+0.163471667 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:42:19 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 26 04:42:19 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-2.fgzdbm' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='client.? ' entity='client.rgw.rgw.compute-1.fbcidm' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Bus STARTING
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Serving on https://192.168.122.100:7150
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Client ('192.168.122.100', 53286) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Serving on http://192.168.122.100:8765
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:19] ENGINE Bus STARTED
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:19 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Jan 26 04:42:20 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:42:20 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:21 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Jan 26 04:42:21 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Unable to set osd_memory_target on compute-0 to 134209126: error parsing value: Value '134209126' is below minimum 939524096
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:21 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 26 04:42:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:22 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1c deep-scrub starts
Jan 26 04:42:23 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1c deep-scrub ok
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='mgr.14358 192.168.122.100:0/344283424' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 26 04:42:23 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1657408057' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 04:42:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 04:42:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 04:42:23 np0005595445 systemd[1]: session-33.scope: Deactivated successfully.
Jan 26 04:42:23 np0005595445 systemd[1]: session-33.scope: Consumed 4.544s CPU time.
Jan 26 04:42:23 np0005595445 systemd-logind[783]: Session 33 logged out. Waiting for processes to exit.
Jan 26 04:42:23 np0005595445 systemd-logind[783]: Removed session 33.
Jan 26 04:42:23 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 04:42:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:24.086+0000 7f58e812f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:24 np0005595445 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:24 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 04:42:24 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 26 04:42:24 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 26 04:42:24 np0005595445 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:24 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 04:42:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:24.169+0000 7f58e812f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:24 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/1657408057' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 26 04:42:24 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/974467440' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 26 04:42:24 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.039+0000 7f58e812f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:25 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 26 04:42:25 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 26 04:42:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.703+0000 7f58e812f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:  from numpy import show_config as show_numpy_config
Jan 26 04:42:25 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/974467440' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:25 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 04:42:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:25.913+0000 7f58e812f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 04:42:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:26.009+0000 7f58e812f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:26 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.d deep-scrub starts
Jan 26 04:42:26 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.d deep-scrub ok
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 04:42:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:26.193+0000 7f58e812f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 04:42:26 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 04:42:27 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 26 04:42:27 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.326+0000 7f58e812f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.591+0000 7f58e812f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.694+0000 7f58e812f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.779+0000 7f58e812f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.875+0000 7f58e812f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:27 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 04:42:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:27.964+0000 7f58e812f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 26 04:42:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.339+0000 7f58e812f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 04:42:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.442+0000 7f58e812f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:28 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 04:42:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:28.886+0000 7f58e812f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.2 deep-scrub starts
Jan 26 04:42:29 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.2 deep-scrub ok
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 04:42:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.495+0000 7f58e812f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.577+0000 7f58e812f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.672+0000 7f58e812f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 04:42:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.832+0000 7f58e812f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:29.904+0000 7f58e812f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:29 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.065+0000 7f58e812f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 04:42:30 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 26 04:42:30 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.306+0000 7f58e812f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 04:42:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 26 04:42:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.608+0000 7f58e812f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.678+0000 7f58e812f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x559dd0457860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  1: '-n'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  2: 'mgr.compute-1.xammti'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  3: '-f'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  4: '--setuser'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  5: 'ceph'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  6: '--setgroup'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  7: 'ceph'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  8: '--default-log-to-file=false'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  9: '--default-log-to-journald=true'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr respawn  exe_path /proc/self/exe
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:30.917+0000 7f6e378db140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:42:30 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 04:42:30 np0005595445 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 04:42:30 np0005595445 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 04:42:31 np0005595445 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:31.023+0000 7f6e378db140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:42:31 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 04:42:31 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.d scrub starts
Jan 26 04:42:31 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.d scrub ok
Jan 26 04:42:31 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 04:42:31 np0005595445 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:31 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 04:42:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:31.896+0000 7f6e378db140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.c deep-scrub starts
Jan 26 04:42:32 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.c deep-scrub ok
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.567+0000 7f6e378db140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:  from numpy import show_config as show_numpy_config
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.743+0000 7f6e378db140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.816+0000 7f6e378db140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:32.962+0000 7f6e378db140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:42:32 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 04:42:33 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Jan 26 04:42:33 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:33.997+0000 7f6e378db140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:42:33 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 04:42:34 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.f deep-scrub starts
Jan 26 04:42:34 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.f deep-scrub ok
Jan 26 04:42:34 np0005595445 systemd[1]: Stopping User Manager for UID 42477...
Jan 26 04:42:34 np0005595445 systemd[72713]: Activating special unit Exit the Session...
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped target Main User Target.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped target Basic System.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped target Paths.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped target Sockets.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped target Timers.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 04:42:34 np0005595445 systemd[72713]: Closed D-Bus User Message Bus Socket.
Jan 26 04:42:34 np0005595445 systemd[72713]: Stopped Create User's Volatile Files and Directories.
Jan 26 04:42:34 np0005595445 systemd[72713]: Removed slice User Application Slice.
Jan 26 04:42:34 np0005595445 systemd[72713]: Reached target Shutdown.
Jan 26 04:42:34 np0005595445 systemd[72713]: Finished Exit the Session.
Jan 26 04:42:34 np0005595445 systemd[72713]: Reached target Exit the Session.
Jan 26 04:42:34 np0005595445 systemd[1]: user@42477.service: Deactivated successfully.
Jan 26 04:42:34 np0005595445 systemd[1]: Stopped User Manager for UID 42477.
Jan 26 04:42:34 np0005595445 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.229+0000 7f6e378db140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 26 04:42:34 np0005595445 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 26 04:42:34 np0005595445 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 26 04:42:34 np0005595445 systemd[1]: Removed slice User Slice of UID 42477.
Jan 26 04:42:34 np0005595445 systemd[1]: user-42477.slice: Consumed 1min 43.767s CPU time.
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.314+0000 7f6e378db140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.381+0000 7f6e378db140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.458+0000 7f6e378db140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.529+0000 7f6e378db140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 04:42:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:34.895+0000 7f6e378db140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:42:34 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 04:42:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:35.003+0000 7f6e378db140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:35 np0005595445 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:42:35 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 04:42:35 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.d scrub starts
Jan 26 04:42:35 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.d scrub ok
Jan 26 04:42:35 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 04:42:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:35.504+0000 7f6e378db140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:35 np0005595445 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:42:35 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 04:42:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:36 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.9 deep-scrub starts
Jan 26 04:42:36 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.9 deep-scrub ok
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.103+0000 7f6e378db140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.175+0000 7f6e378db140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.262+0000 7f6e378db140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.415+0000 7f6e378db140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.494+0000 7f6e378db140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.646+0000 7f6e378db140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 04:42:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 26 04:42:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:36.877+0000 7f6e378db140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:42:36 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 04:42:37 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.10 deep-scrub starts
Jan 26 04:42:37 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.10 deep-scrub ok
Jan 26 04:42:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:37.140+0000 7f6e378db140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 04:42:37 np0005595445 systemd-logind[783]: New session 34 of user ceph-admin.
Jan 26 04:42:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:42:37.206+0000 7f6e378db140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:42:37 np0005595445 systemd[1]: Created slice User Slice of UID 42477.
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x557726af9860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 04:42:37 np0005595445 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 26 04:42:37 np0005595445 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 26 04:42:37 np0005595445 systemd[1]: Starting User Manager for UID 42477...
Jan 26 04:42:37 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 04:42:37 np0005595445 systemd[84066]: Queued start job for default target Main User Target.
Jan 26 04:42:37 np0005595445 systemd[84066]: Created slice User Application Slice.
Jan 26 04:42:37 np0005595445 systemd[84066]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 04:42:37 np0005595445 systemd[84066]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 04:42:37 np0005595445 systemd[84066]: Reached target Paths.
Jan 26 04:42:37 np0005595445 systemd[84066]: Reached target Timers.
Jan 26 04:42:37 np0005595445 systemd[84066]: Starting D-Bus User Message Bus Socket...
Jan 26 04:42:37 np0005595445 systemd[84066]: Starting Create User's Volatile Files and Directories...
Jan 26 04:42:37 np0005595445 systemd[84066]: Listening on D-Bus User Message Bus Socket.
Jan 26 04:42:37 np0005595445 systemd[84066]: Reached target Sockets.
Jan 26 04:42:37 np0005595445 systemd[84066]: Finished Create User's Volatile Files and Directories.
Jan 26 04:42:37 np0005595445 systemd[84066]: Reached target Basic System.
Jan 26 04:42:37 np0005595445 systemd[1]: Started User Manager for UID 42477.
Jan 26 04:42:37 np0005595445 systemd[84066]: Reached target Main User Target.
Jan 26 04:42:37 np0005595445 systemd[84066]: Startup finished in 125ms.
Jan 26 04:42:37 np0005595445 systemd[1]: Started Session 34 of User ceph-admin.
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e2 new map
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e2 print_map#012e2#012btime 2026-01-26T09:42:37:723366+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:37.723319+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Jan 26 04:42:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 26 04:42:38 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 26 04:42:38 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 26 04:42:38 np0005595445 podman[84204]: 2026-01-26 09:42:38.14791446 +0000 UTC m=+0.055282932 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 26 04:42:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:38 np0005595445 podman[84204]: 2026-01-26 09:42:38.238436257 +0000 UTC m=+0.145804749 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:42:39 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 26 04:42:39 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:37] ENGINE Bus STARTING
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Serving on https://192.168.122.100:7150
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Client ('192.168.122.100', 37614) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Serving on http://192.168.122.100:8765
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:42:38] ENGINE Bus STARTED
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 26 04:42:40 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 26 04:42:40 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 58 pg[12.0( empty local-lis/les=0/0 n=0 ec=58/58 lis/c=0/0 les/c/f=0/0/0 sis=58) [1] r=0 lpr=58 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:42:41 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 26 04:42:41 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 26 04:42:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 26 04:42:41 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 59 pg[12.0( empty local-lis/les=58/59 n=0 ec=58/58 lis/c=0/0 les/c/f=0/0/0 sis=58) [1] r=0 lpr=58 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:42:42 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Jan 26 04:42:42 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 26 04:42:43 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:43 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.13 deep-scrub starts
Jan 26 04:42:43 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.13 deep-scrub ok
Jan 26 04:42:43 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:43 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:43 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:43 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:43 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:43 np0005595445 systemd[1]: Starting Ceph node-exporter.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:42:43 np0005595445 bash[85564]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 26 04:42:44 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 26 04:42:44 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 26 04:42:44 np0005595445 ceph-mon[80107]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 26 04:42:44 np0005595445 bash[85564]: Getting image source signatures
Jan 26 04:42:44 np0005595445 bash[85564]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 26 04:42:44 np0005595445 bash[85564]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 26 04:42:44 np0005595445 bash[85564]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 26 04:42:45 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 26 04:42:45 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 26 04:42:45 np0005595445 bash[85564]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 26 04:42:45 np0005595445 bash[85564]: Writing manifest to image destination
Jan 26 04:42:45 np0005595445 podman[85564]: 2026-01-26 09:42:45.395366304 +0000 UTC m=+1.446981141 container create 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:42:45 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba8c33c607fbcb65ee9cea632dfe2337902fa89cd4980b43909b2c88fc1600c/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 26 04:42:45 np0005595445 podman[85564]: 2026-01-26 09:42:45.448808443 +0000 UTC m=+1.500423330 container init 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:42:45 np0005595445 podman[85564]: 2026-01-26 09:42:45.453902505 +0000 UTC m=+1.505517342 container start 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:42:45 np0005595445 bash[85564]: 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a
Jan 26 04:42:45 np0005595445 podman[85564]: 2026-01-26 09:42:45.381200911 +0000 UTC m=+1.432815768 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.460Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.460Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.461Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=arp
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=bcache
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=bonding
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=cpu
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=dmi
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=edac
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=entropy
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=filefd
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netclass
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netdev
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=netstat
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nfs
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=nvme
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=os
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=pressure
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=rapl
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=selinux
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=softnet
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=stat
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=textfile
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=time
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=uname
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=xfs
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.462Z caller=node_exporter.go:117 level=info collector=zfs
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.464Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 26 04:42:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1[85640]: ts=2026-01-26T09:42:45.464Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 26 04:42:45 np0005595445 systemd[1]: Started Ceph node-exporter.compute-1 for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:42:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:45 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/234657791' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 26 04:42:45 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/234657791' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 26 04:42:46 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Jan 26 04:42:46 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Jan 26 04:42:46 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:46 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:46 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:46 np0005595445 ceph-mon[80107]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 26 04:42:47 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.16 deep-scrub starts
Jan 26 04:42:47 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.16 deep-scrub ok
Jan 26 04:42:48 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 26 04:42:48 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 26 04:42:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:42:49 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 26 04:42:49 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 26 04:42:49 np0005595445 ceph-mon[80107]: from='client.? 192.168.122.100:0/113232953' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 26 04:42:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 26 04:42:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 26 04:42:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 26 04:42:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 26 04:42:51 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 26 04:42:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 26 04:42:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1b deep-scrub starts
Jan 26 04:42:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 5.1b deep-scrub ok
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.qkzyup", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.qkzyup", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:54 np0005595445 ceph-mon[80107]: Deploying daemon rgw.rgw.compute-0.qkzyup on compute-0
Jan 26 04:42:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 26 04:42:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 26 04:42:55 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:55 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:55 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:55 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.a scrub starts
Jan 26 04:42:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.a scrub ok
Jan 26 04:42:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:42:56 np0005595445 ceph-mon[80107]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 26 04:42:56 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:56 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zprrum", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 04:42:56 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.zprrum", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 04:42:56 np0005595445 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-2.zprrum on compute-2
Jan 26 04:42:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Jan 26 04:42:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.zhqpiu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.zhqpiu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e3 new map
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e3 print_map#012e3#012btime 2026-01-26T09:42:57:034497+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:37.723319+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.zprrum{-1:24220} state up:standby seq 1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e4 new map
Jan 26 04:42:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e4 print_map#012e4#012btime 2026-01-26T09:42:57:061062+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:57.061055+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24220}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.zprrum{0:24220} state up:creating seq 1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-0.zhqpiu on compute-0
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: daemon mds.cephfs.compute-2.zprrum assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: Cluster is now healthy
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: daemon mds.cephfs.compute-2.zprrum is now active in filesystem cephfs as rank 0
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e5 new map
Jan 26 04:42:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e5 print_map#012e5#012btime 2026-01-26T09:42:58:074031+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:58.074029+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24220}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24220 members: 24220#012[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.04377294 +0000 UTC m=+0.039613578 container create d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rbkelk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.rbkelk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 26 04:42:59 np0005595445 systemd[1]: Started libpod-conmon-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope.
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e6 new map
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e6 print_map#012e6#012btime 2026-01-26T09:42:59:090306+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:58.074029+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24220}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24220 members: 24220#012[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.026401728 +0000 UTC m=+0.022242406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e7 new map
Jan 26 04:42:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e7 print_map#012e7#012btime 2026-01-26T09:42:59:119781+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-26T09:42:37.723319+0000#012modified#0112026-01-26T09:42:58.074029+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24220}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24220 members: 24220#012[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 2 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:42:59 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.156527712 +0000 UTC m=+0.152368390 container init d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.164425351 +0000 UTC m=+0.160265999 container start d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.167462385 +0000 UTC m=+0.163303083 container attach d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:42:59 np0005595445 funny_wilson[85759]: 167 167
Jan 26 04:42:59 np0005595445 systemd[1]: libpod-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope: Deactivated successfully.
Jan 26 04:42:59 np0005595445 conmon[85759]: conmon d1b15216db57108421fe <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope/container/memory.events
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.174013996 +0000 UTC m=+0.169854644 container died d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:42:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay-504c794a301c97feb6f5d9e96dce7e817095e716761909643c10cdf3e8f63064-merged.mount: Deactivated successfully.
Jan 26 04:42:59 np0005595445 podman[85742]: 2026-01-26 09:42:59.219961079 +0000 UTC m=+0.215801737 container remove d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_wilson, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:42:59 np0005595445 systemd[1]: libpod-conmon-d1b15216db57108421fe62ffe42b7031ff968e38fa34850d834e3311659e2a31.scope: Deactivated successfully.
Jan 26 04:42:59 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:59 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:59 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:59 np0005595445 systemd[1]: Reloading.
Jan 26 04:42:59 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:42:59 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:42:59 np0005595445 systemd[1]: Starting Ceph mds.cephfs.compute-1.rbkelk for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:43:00 np0005595445 podman[85900]: 2026-01-26 09:43:00.055235279 +0000 UTC m=+0.042345883 container create 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 26 04:43:00 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:00 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:00 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:00 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1698a2e20032296d1ccb8212699fab3c67b3d6b3c7b2f303b7c12ebd42dd93c/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.rbkelk supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:00 np0005595445 podman[85900]: 2026-01-26 09:43:00.116685341 +0000 UTC m=+0.103795985 container init 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 26 04:43:00 np0005595445 podman[85900]: 2026-01-26 09:43:00.122034929 +0000 UTC m=+0.109145523 container start 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 26 04:43:00 np0005595445 bash[85900]: 402cc330e5f0ab6e9d9a513bc9f6c24dd12bc82eabc190b23d3cfb24ed7852db
Jan 26 04:43:00 np0005595445 podman[85900]: 2026-01-26 09:43:00.038583218 +0000 UTC m=+0.025693832 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:43:00 np0005595445 ceph-mon[80107]: Deploying daemon mds.cephfs.compute-1.rbkelk on compute-1
Jan 26 04:43:00 np0005595445 systemd[1]: Started Ceph mds.cephfs.compute-1.rbkelk for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:43:00 np0005595445 ceph-mds[85919]: set uid:gid to 167:167 (ceph:ceph)
Jan 26 04:43:00 np0005595445 ceph-mds[85919]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 26 04:43:00 np0005595445 ceph-mds[85919]: main not setting numa affinity
Jan 26 04:43:00 np0005595445 ceph-mds[85919]: pidfile_write: ignore empty --pid-file
Jan 26 04:43:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mds-cephfs-compute-1-rbkelk[85915]: starting mds.cephfs.compute-1.rbkelk at 
Jan 26 04:43:00 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 7 from mon.2
Jan 26 04:43:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.057868744 +0000 UTC m=+0.042620652 container create f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 26 04:43:01 np0005595445 systemd[1]: Started libpod-conmon-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope.
Jan 26 04:43:01 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.041098119 +0000 UTC m=+0.025850037 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.152355599 +0000 UTC m=+0.137107537 container init f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.1578089 +0000 UTC m=+0.142560798 container start f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.1613667 +0000 UTC m=+0.146118598 container attach f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 26 04:43:01 np0005595445 clever_chatterjee[86044]: 167 167
Jan 26 04:43:01 np0005595445 systemd[1]: libpod-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope: Deactivated successfully.
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.164857736 +0000 UTC m=+0.149609644 container died f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:43:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay-02d9d9ad2b40affc72d3464139b44c2e6e27a5882656f0071e6df85d150186ba-merged.mount: Deactivated successfully.
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.0.0.compute-1.thyhvc
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.0.0.compute-1.thyhvc-rgw
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.thyhvc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:43:01 np0005595445 podman[86028]: 2026-01-26 09:43:01.202467297 +0000 UTC m=+0.187219195 container remove f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_chatterjee, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:43:01 np0005595445 systemd[1]: libpod-conmon-f9a7a0a589cf141e7c263f7fefa150839991b3eb1036543f6edec3aa4f11a0d2.scope: Deactivated successfully.
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e8 new map
Jan 26 04:43:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e8 print_map
e8
btime 2026-01-26T09:43:01:199992+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name  cephfs
epoch  8
flags  12 joinable allow_snaps allow_multimds_snaps
created  2026-01-26T09:42:37.723319+0000
modified  2026-01-26T09:43:01.101127+0000
tableserver  0
root  0
session_timeout  60
session_autoclose  300
max_file_size  1099511627776
max_xattr_size  65536
required_client_features  {}
last_failure  0
last_failure_osd_epoch  0
compat  compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds  1
in  0
up  {0=24220}
failed
damaged
stopped
data_pools  [7]
metadata_pool  6
inline_data  disabled
balancer
bal_rank_mask  -1
standby_count_wanted  1
qdb_cluster  leader: 24220 members: 24220
[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
[mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:43:01 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 8 from mon.2
Jan 26 04:43:01 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Monitors have assigned me to become a standby
Jan 26 04:43:01 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:01 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:01 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:01 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:01 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:01 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:01 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:43:01 np0005595445 podman[86185]: 2026-01-26 09:43:01.96164242 +0000 UTC m=+0.033980382 container create 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1)
Jan 26 04:43:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:01 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:02 np0005595445 podman[86185]: 2026-01-26 09:43:02.005069913 +0000 UTC m=+0.077407875 container init 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Jan 26 04:43:02 np0005595445 podman[86185]: 2026-01-26 09:43:02.010111113 +0000 UTC m=+0.082449075 container start 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:43:02 np0005595445 bash[86185]: 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36
Jan 26 04:43:02 np0005595445 podman[86185]: 2026-01-26 09:43:01.947318263 +0000 UTC m=+0.019656255 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:43:02 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:43:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: Bind address in nfs.cephfs.0.0.compute-1.thyhvc's ganesha conf is defaulting to empty
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: Deploying daemon nfs.cephfs.0.0.compute-1.thyhvc on compute-1
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 04:43:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 04:43:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e9 new map
Jan 26 04:43:03 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.1.0.compute-2.najyrz
Jan 26 04:43:03 np0005595445 ceph-mon[80107]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 26 04:43:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e9 print_map
e9
btime 2026-01-26T09:43:03:225984+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name  cephfs
epoch  8
flags  12 joinable allow_snaps allow_multimds_snaps
created  2026-01-26T09:42:37.723319+0000
modified  2026-01-26T09:43:01.101127+0000
tableserver  0
root  0
session_timeout  60
session_autoclose  300
max_file_size  1099511627776
max_xattr_size  65536
required_client_features  {}
last_failure  0
last_failure_osd_epoch  0
compat  compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds  1
in  0
up  {0=24220}
failed
damaged
stopped
data_pools  [7]
metadata_pool  6
inline_data  disabled
balancer
bal_rank_mask  -1
standby_count_wanted  1
qdb_cluster  leader: 24220 members: 24220
[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
[mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:43:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e10 new map
Jan 26 04:43:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).mds e10 print_map
e10
btime 2026-01-26T09:43:04:246912+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name  cephfs
epoch  8
flags  12 joinable allow_snaps allow_multimds_snaps
created  2026-01-26T09:42:37.723319+0000
modified  2026-01-26T09:43:01.101127+0000
tableserver  0
root  0
session_timeout  60
session_autoclose  300
max_file_size  1099511627776
max_xattr_size  65536
required_client_features  {}
last_failure  0
last_failure_osd_epoch  0
compat  compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds  1
in  0
up  {0=24220}
failed
damaged
stopped
data_pools  [7]
metadata_pool  6
inline_data  disabled
balancer
bal_rank_mask  -1
standby_count_wanted  1
qdb_cluster  leader: 24220 members: 24220
[mds.cephfs.compute-2.zprrum{0:24220} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1987962990,v1:192.168.122.102:6805/1987962990] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.zhqpiu{-1:14568} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4011782606,v1:192.168.122.100:6807/4011782606] compat {c=[1],r=[1],i=[1fff]}]
[mds.cephfs.compute-1.rbkelk{-1:24194} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4143393925,v1:192.168.122.101:6805/4143393925] compat {c=[1],r=[1],i=[1fff]}]
Jan 26 04:43:04 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Updating MDS map to version 10 from mon.2
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:43:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:05 : epoch 69773726 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:43:05 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 04:43:05 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 04:43:05 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:43:05 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.najyrz-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:43:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:06 np0005595445 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 04:43:06 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.1.0.compute-2.najyrz-rgw
Jan 26 04:43:06 np0005595445 ceph-mon[80107]: Bind address in nfs.cephfs.1.0.compute-2.najyrz's ganesha conf is defaulting to empty
Jan 26 04:43:06 np0005595445 ceph-mon[80107]: Deploying daemon nfs.cephfs.1.0.compute-2.najyrz on compute-2
Jan 26 04:43:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:43:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:43:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:07 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.2.0.compute-0.zfynkw
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 26 04:43:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 26 04:43:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:43:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: Rados config object exists: conf-nfs.cephfs
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: Creating key for client.nfs.cephfs.2.0.compute-0.zfynkw-rgw
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.zfynkw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: Bind address in nfs.cephfs.2.0.compute-0.zfynkw's ganesha conf is defaulting to empty
Jan 26 04:43:11 np0005595445 ceph-mon[80107]: Deploying daemon nfs.cephfs.2.0.compute-0.zfynkw on compute-0
Jan 26 04:43:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:43:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:13 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:43:13 np0005595445 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-1.nsxfyf on compute-1
Jan 26 04:43:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.811445735 +0000 UTC m=+3.195279905 container create cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.795727208 +0000 UTC m=+3.179561398 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 04:43:15 np0005595445 systemd[1]: Started libpod-conmon-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope.
Jan 26 04:43:15 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.902710862 +0000 UTC m=+3.286545062 container init cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.910107173 +0000 UTC m=+3.293941363 container start cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.913557127 +0000 UTC m=+3.297391317 container attach cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 trusting_wescoff[86470]: 0 0
Jan 26 04:43:15 np0005595445 systemd[1]: libpod-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope: Deactivated successfully.
Jan 26 04:43:15 np0005595445 conmon[86470]: conmon cd82b8b685487e6869f6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope/container/memory.events
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.916519228 +0000 UTC m=+3.300353398 container died cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 systemd[1]: var-lib-containers-storage-overlay-4fb6e6ec22f4f145ed668c0992bc0d2e45431868371b875bac4ac207ab24479c-merged.mount: Deactivated successfully.
Jan 26 04:43:15 np0005595445 podman[86346]: 2026-01-26 09:43:15.951387017 +0000 UTC m=+3.335221187 container remove cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772 (image=quay.io/ceph/haproxy:2.3, name=trusting_wescoff)
Jan 26 04:43:15 np0005595445 systemd[1]: libpod-conmon-cd82b8b685487e6869f6514c1b202d116d53f159dcea2ee92b192ddfd4e95772.scope: Deactivated successfully.
Jan 26 04:43:16 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:16 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:16 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:16 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:16 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:16 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:16 np0005595445 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.nsxfyf for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:43:16 np0005595445 podman[86614]: 2026-01-26 09:43:16.704856203 +0000 UTC m=+0.039610290 container create 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:43:16 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c30f8b227333eb121817e81f5fdeba5332091ff0a4a2104b1df0d5d22f450f1/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:16 np0005595445 podman[86614]: 2026-01-26 09:43:16.754335501 +0000 UTC m=+0.089089618 container init 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:43:16 np0005595445 podman[86614]: 2026-01-26 09:43:16.759449281 +0000 UTC m=+0.094203368 container start 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:43:16 np0005595445 bash[86614]: 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29
Jan 26 04:43:16 np0005595445 podman[86614]: 2026-01-26 09:43:16.686866984 +0000 UTC m=+0.021621101 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 26 04:43:16 np0005595445 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.nsxfyf for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:43:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094316 (2) : New worker #1 (4) forked
Jan 26 04:43:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:17 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:17 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:17 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:17 np0005595445 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-0.eucyze on compute-0
Jan 26 04:43:17 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:21 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:21 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:21 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:21 np0005595445 ceph-mon[80107]: Deploying daemon haproxy.nfs.cephfs.compute-2.rbycaf on compute-2
Jan 26 04:43:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001bd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 04:43:25 np0005595445 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-1.wvnxoh on compute-1
Jan 26 04:43:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.264663092 +0000 UTC m=+3.012077286 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.281560572 +0000 UTC m=+3.028974736 container create bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, release=1793, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.buildah.version=1.28.2, com.redhat.component=keepalived-container)
Jan 26 04:43:28 np0005595445 systemd[1]: Started libpod-conmon-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope.
Jan 26 04:43:28 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.35381857 +0000 UTC m=+3.101232754 container init bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, version=2.2.4, io.buildah.version=1.28.2, io.openshift.expose-services=, name=keepalived, com.redhat.component=keepalived-container, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64)
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.364330257 +0000 UTC m=+3.111744421 container start bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, release=1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9)
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.367902664 +0000 UTC m=+3.115316848 container attach bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.expose-services=, distribution-scope=public, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 26 04:43:28 np0005595445 elegant_dijkstra[86835]: 0 0
Jan 26 04:43:28 np0005595445 systemd[1]: libpod-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope: Deactivated successfully.
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.371435571 +0000 UTC m=+3.118849735 container died bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793)
Jan 26 04:43:28 np0005595445 systemd[1]: var-lib-containers-storage-overlay-dff235b961c542b0cea5919d5c474015d41b6a32017cd3bea6ea6ea8cdaddf45-merged.mount: Deactivated successfully.
Jan 26 04:43:28 np0005595445 podman[86740]: 2026-01-26 09:43:28.423993312 +0000 UTC m=+3.171407476 container remove bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8 (image=quay.io/ceph/keepalived:2.2.4, name=elegant_dijkstra, io.openshift.expose-services=, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9)
Jan 26 04:43:28 np0005595445 systemd[1]: libpod-conmon-bffff4c47e045cdc9ecc4e594dd6b2800473993b8d114fb10e12cd08ac00b4a8.scope: Deactivated successfully.
Jan 26 04:43:28 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:28 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:28 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:28 np0005595445 systemd[1]: Reloading.
Jan 26 04:43:28 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:43:28 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:43:29 np0005595445 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.wvnxoh for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:43:29 np0005595445 podman[86979]: 2026-01-26 09:43:29.328238635 +0000 UTC m=+0.048363098 container create 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, version=2.2.4, vcs-type=git, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=keepalived)
Jan 26 04:43:29 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8357d20c9b2e4fd9989a87363535dc76d373866c082f1d07506c1d70a54e2f5f/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:43:29 np0005595445 podman[86979]: 2026-01-26 09:43:29.386777229 +0000 UTC m=+0.106901712 container init 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.buildah.version=1.28.2, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc.)
Jan 26 04:43:29 np0005595445 podman[86979]: 2026-01-26 09:43:29.393655437 +0000 UTC m=+0.113779900 container start 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1793, io.buildah.version=1.28.2, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived)
Jan 26 04:43:29 np0005595445 bash[86979]: 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268
Jan 26 04:43:29 np0005595445 podman[86979]: 2026-01-26 09:43:29.305384172 +0000 UTC m=+0.025508655 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 26 04:43:29 np0005595445 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.wvnxoh for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Starting VRRP child process, pid=4
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: Startup complete
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: (VI_0) Entering BACKUP STATE (init)
Jan 26 04:43:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:29 2026: VRRP_Script(check_backend) succeeded
Jan 26 04:43:30 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:30 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:30 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a0001d50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40089d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:31 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 04:43:31 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 04:43:31 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 04:43:31 np0005595445 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-0.orrhyj on compute-0
Jan 26 04:43:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:33 2026: (VI_0) Entering MASTER STATE
Jan 26 04:43:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940016c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 26 04:43:36 np0005595445 ceph-mon[80107]: Deploying daemon keepalived.nfs.cephfs.compute-2.ovafut on compute-2
Jan 26 04:43:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.373870) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617374129, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6974, "num_deletes": 257, "total_data_size": 18574666, "memory_usage": 19565664, "flush_reason": "Manual Compaction"}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617472296, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11825899, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6979, "table_properties": {"data_size": 11799325, "index_size": 16859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 83620, "raw_average_key_size": 24, "raw_value_size": 11732902, "raw_average_value_size": 3395, "num_data_blocks": 745, "num_entries": 3455, "num_filter_entries": 3455, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 1769420465, "file_creation_time": 1769420617, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 98473 microseconds, and 34322 cpu microseconds.
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.472383) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11825899 bytes OK
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.472407) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473836) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473852) EVENT_LOG_v1 {"time_micros": 1769420617473847, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.473868) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18537159, prev total WAL file size 18537159, number of live WAL files 2.
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.476793) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1773B)]
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617476866, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11827672, "oldest_snapshot_seqno": -1}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3202 keys, 11822698 bytes, temperature: kUnknown
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617566792, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11822698, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11796640, "index_size": 16924, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8069, "raw_key_size": 80115, "raw_average_key_size": 25, "raw_value_size": 11733307, "raw_average_value_size": 3664, "num_data_blocks": 746, "num_entries": 3202, "num_filter_entries": 3202, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420617, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.567014) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11822698 bytes
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.568956) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.4 rd, 131.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.3, 0.0 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3460, records dropped: 258 output_compression: NoCompression
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.568984) EVENT_LOG_v1 {"time_micros": 1769420617568971, "job": 4, "event": "compaction_finished", "compaction_time_micros": 90007, "compaction_time_cpu_micros": 22555, "output_level": 6, "num_output_files": 1, "total_output_size": 11822698, "num_input_records": 3460, "num_output_records": 3202, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617571043, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420617571100, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 26 04:43:37 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:37.476735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:37 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 26 04:43:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh[86994]: Mon Jan 26 09:43:37 2026: (VI_0) Entering BACKUP STATE
Jan 26 04:43:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:43:38 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:43:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 26 04:43:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00031e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:43:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:43:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:39 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:39 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 26 04:43:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40096e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.885763) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621885816, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 385, "num_deletes": 251, "total_data_size": 376159, "memory_usage": 384928, "flush_reason": "Manual Compaction"}
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621890957, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 248024, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6984, "largest_seqno": 7364, "table_properties": {"data_size": 245642, "index_size": 482, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5757, "raw_average_key_size": 18, "raw_value_size": 240815, "raw_average_value_size": 759, "num_data_blocks": 21, "num_entries": 317, "num_filter_entries": 317, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420617, "oldest_key_time": 1769420617, "file_creation_time": 1769420621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5224 microseconds, and 1353 cpu microseconds.
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.890993) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 248024 bytes OK
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.891011) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892417) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892431) EVENT_LOG_v1 {"time_micros": 1769420621892427, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892446) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 373573, prev total WAL file size 373573, number of live WAL files 2.
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892815) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(242KB)], [15(11MB)]
Jan 26 04:43:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420621892853, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12070722, "oldest_snapshot_seqno": -1}
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3000 keys, 10943808 bytes, temperature: kUnknown
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622036680, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 10943808, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10919746, "index_size": 15441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7557, "raw_key_size": 76992, "raw_average_key_size": 25, "raw_value_size": 10860461, "raw_average_value_size": 3620, "num_data_blocks": 674, "num_entries": 3000, "num_filter_entries": 3000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420621, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.036958) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 10943808 bytes
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.246158) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 83.9 rd, 76.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 11.3 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(92.8) write-amplify(44.1) OK, records in: 3519, records dropped: 519 output_compression: NoCompression
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.246230) EVENT_LOG_v1 {"time_micros": 1769420622246204, "job": 6, "event": "compaction_finished", "compaction_time_micros": 143945, "compaction_time_cpu_micros": 23740, "output_level": 6, "num_output_files": 1, "total_output_size": 10943808, "num_input_records": 3519, "num_output_records": 3000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622246547, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420622249101, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:41.892765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249231) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:43:42.249237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:43:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 26 04:43:42 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 65 pg[10.0( v 60'48 (0'0,60'48] local-lis/les=50/51 n=8 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65 pruub=9.585060120s) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 60'47 active pruub 210.508377075s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:42 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 65 pg[10.0( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65 pruub=9.585060120s) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 0'0 unknown pruub 210.508377075s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1b( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.7( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.12( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.11( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.10( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1f( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1e( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1d( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1a( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.19( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1c( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.6( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.18( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.5( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.4( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.3( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.b( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.8( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.d( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.a( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.9( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.c( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.f( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.e( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.2( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.14( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.13( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.15( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.16( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.17( v 60'48 lc 0'0 (0'0,60'48] local-lis/les=50/51 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.7( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1a( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1d( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.6( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.3( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.9( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.a( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1c( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.c( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.d( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.0( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=50/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 60'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.14( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.15( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.16( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 66 pg[10.17( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=50/50 les/c/f=51/51/0 sis=65) [1] r=0 lpr=65 pi=[50,65)/1 crt=60'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:44 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Jan 26 04:43:44 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Jan 26 04:43:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 26 04:43:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 67 pg[12.0( v 60'46 (0'0,60'46] local-lis/les=58/59 n=5 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67 pruub=8.900333405s) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 60'45 mlcod 60'45 active pruub 211.830657959s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 67 pg[12.0( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67 pruub=8.900333405s) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 60'45 mlcod 0'0 unknown pruub 211.830657959s@ mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:44 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1).collection(12.0_head 0x55d1702b86c0) operator()   moving buffer(0x55d170cba208 space 0x55d170cfe350 0x0~1000 clean)
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: Regenerating cephadm self-signed grafana TLS certificates
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: Deploying daemon grafana.compute-0 on compute-0
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:44 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 26 04:43:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.11( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.13( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.12( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.10( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.15( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.7( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.4( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.6( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.9( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.8( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.a( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.c( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.f( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.b( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.e( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.5( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.2( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.3( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.d( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1e( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1f( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1c( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1a( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1b( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.18( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.19( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.16( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.14( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.17( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1( v 60'46 (0'0,60'46] local-lis/les=58/59 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1d( v 60'46 lc 0'0 (0'0,60'46] local-lis/les=58/59 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'45 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:45 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.15( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.f( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.10( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.5( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1f( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.0( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=58/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 60'45 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.16( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.14( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 68 pg[12.d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=58/58 les/c/f=59/59/0 sis=67) [1] r=0 lpr=67 pi=[58,67)/1 crt=60'46 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:46 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 26 04:43:46 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 26 04:43:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:47 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1f deep-scrub starts
Jan 26 04:43:47 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1f deep-scrub ok
Jan 26 04:43:48 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Jan 26 04:43:48 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Jan 26 04:43:48 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a400a3f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.10( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.211609840s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=68'47 lcod 68'48 mlcod 68'48 active pruub 219.969894409s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.15( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.197128296s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955444336s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.10( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.211562157s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=68'47 lcod 68'48 mlcod 0'0 unknown NOTIFY pruub 219.969894409s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.15( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.197095871s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955444336s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204886436s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.963439941s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.14( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196805000s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955413818s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204831123s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.963439941s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.13( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204850197s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.963439941s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.12( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.204812050s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.963439941s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.14( v 66'51 (0'0,66'51] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196767807s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955413818s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196599960s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955383301s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.13( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196578979s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955383301s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196598053s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955474854s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210849762s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969787598s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.2( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196548462s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955474854s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.4( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210830688s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969787598s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196307182s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955322266s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210350990s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969390869s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196286201s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955322266s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210316658s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969497681s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.6( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210298538s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969497681s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196280479s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955535889s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.f( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.196266174s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955535889s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210173607s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969512939s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.9( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210156441s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969512939s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210277557s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969680786s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.8( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210260391s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969680786s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210008621s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969558716s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.a( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209998131s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969558716s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.7( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210334778s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969390869s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210254669s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969909668s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.b( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210244179s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969909668s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210129738s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969818115s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210205078s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969924927s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195512772s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955230713s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.8( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195491791s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955230713s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210190773s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969924927s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.210109711s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969818115s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.3( v 66'51 (0'0,66'51] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195200920s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 66'50 active pruub 217.955062866s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.3( v 66'51 (0'0,66'51] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195178032s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 66'50 mlcod 0'0 unknown NOTIFY pruub 217.955062866s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195085526s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955047607s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.4( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.195068359s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209959984s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969940186s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.2( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209941864s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969940186s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209921837s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970046997s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.3( v 60'46 (0'0,60'46] local-lis/les=67/68 n=1 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209904671s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970046997s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194754601s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955001831s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209153175s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969436646s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.11( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209136009s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969436646s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.5( v 60'48 (0'0,60'48] local-lis/les=65/66 n=1 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194731712s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194509506s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954986572s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.18( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194488525s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954986572s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209237099s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.969985962s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1e( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209216118s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.969985962s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.194181442s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954910278s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209335327s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970230103s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1c( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209313393s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970230103s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1a( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209171295s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 68'48 mlcod 68'48 active pruub 219.970169067s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1a( v 68'49 (0'0,68'49] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209129333s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 68'48 mlcod 0'0 unknown NOTIFY pruub 219.970169067s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193212509s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954437256s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209067345s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970306396s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.18( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.209049225s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970306396s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1e( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193169594s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954437256s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208838463s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970321655s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.19( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208822250s) [0] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970321655s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193565369s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.955093384s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.19( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193413734s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954910278s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192659378s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.954299927s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.11( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192640305s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.954299927s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208807945s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970520020s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.17( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208787918s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970520020s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192235947s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.953994751s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.12( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.192203522s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.953994751s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208681107s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 active pruub 219.970489502s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.10( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.193547249s) [2] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.955093384s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[12.1d( v 60'46 (0'0,60'46] local-lis/les=67/68 n=0 ec=67/58 lis/c=67/67 les/c/f=68/68/0 sis=69 pruub=12.208670616s) [2] r=-1 lpr=69 pi=[67,69)/1 crt=60'46 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 219.970489502s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.154850006s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 active pruub 217.916732788s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[10.1b( v 60'48 (0'0,60'48] local-lis/les=65/66 n=0 ec=65/50 lis/c=65/65 les/c/f=66/66/0 sis=69 pruub=10.154829025s) [0] r=-1 lpr=69 pi=[65,69)/1 crt=60'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.916732788s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 26 04:43:49 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.12( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.12( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.10( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.7( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.4( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.4( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.14( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.17( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.5( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.8( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.f( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1e( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1d( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1c( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.18( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1b( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[11.1a( empty local-lis/les=0/0 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.19( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.1b( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:49 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 69 pg[8.14( empty local-lis/les=0/0 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1a( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.10( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 26 04:43:50 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1e( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.19( v 47'9 lc 0'0 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.12( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1c( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.1b( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1b( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.7( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.4( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.4( v 47'9 (0'0,47'9] local-lis/les=69/70 n=1 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.5( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.18( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1d( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.1( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.8( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.f( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.17( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.14( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[8.14( v 47'9 (0'0,47'9] local-lis/les=69/70 n=0 ec=63/46 lis/c=63/63 les/c/f=65/65/0 sis=69) [1] r=0 lpr=69 pi=[63,69)/1 crt=47'9 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:50 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 70 pg[11.12( empty local-lis/les=69/70 n=0 ec=66/52 lis/c=66/66 les/c/f=67/67/0 sis=69) [1] r=0 lpr=69 pi=[66,69)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:43:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Jan 26 04:43:51 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Jan 26 04:43:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 26 04:43:51 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 26 04:43:51 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 26 04:43:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.15 scrub starts
Jan 26 04:43:52 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.15 scrub ok
Jan 26 04:43:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: Deploying daemon haproxy.rgw.default.compute-0.ovxbdp on compute-0
Jan 26 04:43:52 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 26 04:43:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.0 scrub starts
Jan 26 04:43:53 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.0 scrub ok
Jan 26 04:43:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 26 04:43:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 26 04:43:54 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 26 04:43:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:54 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 26 04:43:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 26 04:43:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 26 04:43:55 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 26 04:43:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:43:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 26 04:43:56 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 26 04:43:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 26 04:43:56 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 26 04:43:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:43:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000080s ======
Jan 26 04:43:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:43:56.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000080s
Jan 26 04:43:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:57 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 26 04:43:57 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: Deploying daemon haproxy.rgw.default.compute-2.yyinob on compute-2
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 26 04:43:58 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.f scrub starts
Jan 26 04:43:58 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.f scrub ok
Jan 26 04:43:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:43:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:43:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:43:58.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:43:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:43:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:43:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:43:58.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:43:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:43:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: Deploying daemon keepalived.rgw.default.compute-0.dhkprh on compute-0
Jan 26 04:43:58 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 26 04:43:58 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.d deep-scrub starts
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.d deep-scrub ok
Jan 26 04:43:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=76) [1] r=0 lpr=76 pi=[63,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.b scrub starts
Jan 26 04:43:59 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 26 04:43:59 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.b scrub ok
Jan 26 04:44:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.16( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.6( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 77 pg[9.1e( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=77) [1]/[0] r=-1 lpr=77 pi=[63,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:44:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:00.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:44:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:44:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:00.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:44:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.d deep-scrub starts
Jan 26 04:44:00 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.d deep-scrub ok
Jan 26 04:44:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:01 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 26 04:44:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 26 04:44:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:02.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:02 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 26 04:44:02 np0005595445 ceph-mon[80107]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 26 04:44:02 np0005595445 ceph-mon[80107]: Deploying daemon keepalived.rgw.default.compute-2.djgvpg on compute-2
Jan 26 04:44:02 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 26 04:44:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:02 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 79 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:44:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:02.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:44:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478002f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 26 04:44:04 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.6( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:04 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=6 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:04 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:04 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 80 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=77/63 les/c/f=78/65/0 sis=79) [1] r=0 lpr=79 pi=[63,79)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:04 np0005595445 ceph-mon[80107]: Deploying daemon prometheus.compute-0 on compute-0
Jan 26 04:44:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:04.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480002140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:04.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:04 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 26 04:44:05 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 26 04:44:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 26 04:44:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:05 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 26 04:44:05 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.6 scrub starts
Jan 26 04:44:05 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.6 scrub ok
Jan 26 04:44:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:06.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 26 04:44:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:44:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:06.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:44:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:06 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Jan 26 04:44:06 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Jan 26 04:44:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 26 04:44:07 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.5 scrub starts
Jan 26 04:44:07 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.5 scrub ok
Jan 26 04:44:08 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:08.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 26 04:44:08 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.0 scrub starts
Jan 26 04:44:08 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.0 scrub ok
Jan 26 04:44:09 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 26 04:44:09 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 26 04:44:09 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Jan 26 04:44:09 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Jan 26 04:44:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 26 04:44:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:10.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:10.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:10 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1f deep-scrub starts
Jan 26 04:44:10 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1f deep-scrub ok
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 26 04:44:11 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 86 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=86) [1] r=0 lpr=86 pi=[63,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:11 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 86 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=86) [1] r=0 lpr=86 pi=[63,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 26 04:44:11 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 26 04:44:11 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Jan 26 04:44:11 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 26 04:44:12 np0005595445 systemd[1]: session-34.scope: Deactivated successfully.
Jan 26 04:44:12 np0005595445 systemd[1]: session-34.scope: Consumed 17.799s CPU time.
Jan 26 04:44:12 np0005595445 systemd-logind[783]: Session 34 logged out. Waiting for processes to exit.
Jan 26 04:44:12 np0005595445 systemd-logind[783]: Removed session 34.
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setuser ceph since I am not root
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: ignoring --setgroup ceph since I am not root
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: pidfile_write: ignore empty --pid-file
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'alerts'
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:12.237+0000 7f00d41c2140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'balancer'
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:12.314+0000 7f00d41c2140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 26 04:44:12 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'cephadm'
Jan 26 04:44:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 26 04:44:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:12.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 87 pg[9.1a( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=87) [1]/[0] r=-1 lpr=87 pi=[63,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 26 04:44:12 np0005595445 ceph-mon[80107]: from='mgr.14454 192.168.122.100:0/1534630975' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:12.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1d scrub starts
Jan 26 04:44:12 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 10.1d scrub ok
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'crash'
Jan 26 04:44:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:13.201+0000 7f00d41c2140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'dashboard'
Jan 26 04:44:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'devicehealth'
Jan 26 04:44:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:13.844+0000 7f00d41c2140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 26 04:44:13 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'diskprediction_local'
Jan 26 04:44:13 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1b scrub starts
Jan 26 04:44:13 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1b scrub ok
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]:  from numpy import show_config as show_numpy_config
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.018+0000 7f00d41c2140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'influx'
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.085+0000 7f00d41c2140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'insights'
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'iostat'
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:14.229+0000 7f00d41c2140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'k8sevents'
Jan 26 04:44:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:14.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:14 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 89 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'localpool'
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mds_autoscaler'
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:14.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:14 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'mirroring'
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.16 scrub starts
Jan 26 04:44:14 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.16 scrub ok
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'nfs'
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.281+0000 7f00d41c2140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'orchestrator'
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.511+0000 7f00d41c2140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_perf_query'
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.589+0000 7f00d41c2140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'osd_support'
Jan 26 04:44:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.661+0000 7f00d41c2140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'pg_autoscaler'
Jan 26 04:44:15 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 90 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=5 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:15 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 90 pg[9.a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=9 ec=63/48 lis/c=87/63 les/c/f=88/65/0 sis=89) [1] r=0 lpr=89 pi=[63,89)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.756+0000 7f00d41c2140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'progress'
Jan 26 04:44:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:15.837+0000 7f00d41c2140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 26 04:44:15 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'prometheus'
Jan 26 04:44:15 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.14 scrub starts
Jan 26 04:44:15 np0005595445 systemd-logind[783]: New session 36 of user zuul.
Jan 26 04:44:15 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.14 scrub ok
Jan 26 04:44:15 np0005595445 systemd[1]: Started Session 36 of User zuul.
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.207+0000 7f00d41c2140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rbd_support'
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.311+0000 7f00d41c2140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'restful'
Jan 26 04:44:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:16.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rgw'
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1480003490 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:16.761+0000 7f00d41c2140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 26 04:44:16 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'rook'
Jan 26 04:44:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:16.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:16 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1 scrub starts
Jan 26 04:44:16 np0005595445 python3.9[87220]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:44:16 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 12.1 scrub ok
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.368+0000 7f00d41c2140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'selftest'
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.442+0000 7f00d41c2140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'snap_schedule'
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.529+0000 7f00d41c2140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'stats'
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'status'
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.711+0000 7f00d41c2140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telegraf'
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.796+0000 7f00d41c2140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'telemetry'
Jan 26 04:44:17 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 26 04:44:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:17.964+0000 7f00d41c2140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 26 04:44:17 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'test_orchestrator'
Jan 26 04:44:17 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.203+0000 7f00d41c2140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'volumes'
Jan 26 04:44:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:18.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.518+0000 7f00d41c2140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr[py] Loading python module 'zabbix'
Jan 26 04:44:18 np0005595445 python3.9[87435]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 2026-01-26T09:44:18.611+0000 7f00d41c2140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr load Constructed class from module: dashboard
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: mgr load Constructed class from module: prometheus
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: ms_deliver_dispatch: unhandled message 0x55eef89e7860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO root] Starting engine...
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Bus STARTING
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Bus STARTING
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: CherryPy Checker:
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: The Application mounted at '' has an empty config.
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: 
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Starting engine...
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [dashboard INFO root] Engine started...
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Serving on http://:::9283
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Serving on http://:::9283
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-mgr-compute-1-xammti[80412]: [26/Jan/2026:09:44:18] ENGINE Bus STARTED
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO cherrypy.error] [26/Jan/2026:09:44:18] ENGINE Bus STARTED
Jan 26 04:44:18 np0005595445 ceph-mgr[80416]: [prometheus INFO root] Engine started.
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:18.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 15 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:19 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1e deep-scrub starts
Jan 26 04:44:19 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1e deep-scrub ok
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: Active manager daemon compute-0.zllcia restarted
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: Activating manager daemon compute-0.zllcia
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: Manager daemon compute-0.zllcia is now available
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/mirror_snapshot_schedule"}]: dispatch
Jan 26 04:44:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.zllcia/trash_purge_schedule"}]: dispatch
Jan 26 04:44:19 np0005595445 systemd-logind[783]: New session 37 of user ceph-admin.
Jan 26 04:44:19 np0005595445 systemd[1]: Started Session 37 of User ceph-admin.
Jan 26 04:44:19 np0005595445 podman[87599]: 2026-01-26 09:44:19.897302696 +0000 UTC m=+0.054763931 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:44:19 np0005595445 podman[87599]: 2026-01-26 09:44:19.98710402 +0000 UTC m=+0.144565255 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:44:20 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 26 04:44:20 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 26 04:44:20 np0005595445 podman[87733]: 2026-01-26 09:44:20.494912924 +0000 UTC m=+0.068003165 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:44:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:20.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:20 np0005595445 podman[87733]: 2026-01-26 09:44:20.532123655 +0000 UTC m=+0.105213886 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:44:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:20 np0005595445 podman[87806]: 2026-01-26 09:44:20.736082599 +0000 UTC m=+0.045009035 container exec 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 04:44:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:20 np0005595445 podman[87806]: 2026-01-26 09:44:20.750011273 +0000 UTC m=+0.058937689 container exec_died 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 26 04:44:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:20.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:20 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:20 np0005595445 podman[87873]: 2026-01-26 09:44:20.944644604 +0000 UTC m=+0.057217025 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:44:20 np0005595445 podman[87873]: 2026-01-26 09:44:20.954058409 +0000 UTC m=+0.066630800 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:44:20 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 26 04:44:21 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:44:19] ENGINE Bus STARTING
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:44:19] ENGINE Serving on http://192.168.122.100:8765
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Serving on https://192.168.122.100:7150
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Bus STARTED
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: [26/Jan/2026:09:44:20] ENGINE Client ('192.168.122.100', 36132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 26 04:44:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 04:44:21 np0005595445 podman[87939]: 2026-01-26 09:44:21.14526337 +0000 UTC m=+0.053084067 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, distribution-scope=public, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vcs-type=git, io.buildah.version=1.28.2)
Jan 26 04:44:21 np0005595445 podman[87939]: 2026-01-26 09:44:21.170309463 +0000 UTC m=+0.078130140 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=1793, architecture=x86_64, vcs-type=git, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4)
Jan 26 04:44:21 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 26 04:44:21 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 26 04:44:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:22.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:22.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:22 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 26 04:44:22 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 26 04:44:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 26 04:44:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:23 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Jan 26 04:44:23 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Jan 26 04:44:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:24.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:24.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:24 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:24 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 26 04:44:25 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 26 04:44:25 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 26 04:44:25 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 26 04:44:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:26.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:26 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a40092a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.conf
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.conf
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.conf
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 26 04:44:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 26 04:44:27 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 26 04:44:27 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 26 04:44:27 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.f scrub starts
Jan 26 04:44:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.f scrub ok
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.conf
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 26 04:44:28 np0005595445 ceph-mon[80107]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 26 04:44:28 np0005595445 systemd[1]: session-36.scope: Deactivated successfully.
Jan 26 04:44:28 np0005595445 systemd[1]: session-36.scope: Consumed 8.048s CPU time.
Jan 26 04:44:28 np0005595445 systemd-logind[783]: Session 36 logged out. Waiting for processes to exit.
Jan 26 04:44:28 np0005595445 systemd-logind[783]: Removed session 36.
Jan 26 04:44:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:28.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:28.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:28 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.8 deep-scrub starts
Jan 26 04:44:28 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.8 deep-scrub ok
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: Updating compute-1:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: Updating compute-0:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: Updating compute-2:/var/lib/ceph/1a70b85d-e3fd-5814-8a6a-37ea00fcae30/config/ceph.client.admin.keyring
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:44:29 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 26 04:44:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 95 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=95) [1] r=0 lpr=95 pi=[79,95)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 95 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=95) [1] r=0 lpr=95 pi=[78,95)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:29 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Jan 26 04:44:30 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 26 04:44:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:30.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[79,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[79,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 96 pg[9.1d( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=96) [1]/[2] r=-1 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:30.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:30 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Jan 26 04:44:30 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Jan 26 04:44:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 26 04:44:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 26 04:44:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094431 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:44:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 26 04:44:31 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Jan 26 04:44:31 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Jan 26 04:44:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:32.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:32 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=98) [1] r=0 lpr=98 pi=[75,98)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=98) [1] r=0 lpr=98 pi=[75,98)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 98 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Jan 26 04:44:32 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Jan 26 04:44:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 26 04:44:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 26 04:44:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1f( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=75/75 les/c/f=76/76/0 sis=99) [1]/[2] r=-1 lpr=99 pi=[75,99)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=6 ec=63/48 lis/c=96/79 les/c/f=97/80/0 sis=98) [1] r=0 lpr=98 pi=[79,98)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:33 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 99 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=96/78 les/c/f=97/79/0 sis=98) [1] r=0 lpr=98 pi=[78,98)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:34.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 26 04:44:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:34.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:34 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:34 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Jan 26 04:44:34 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Jan 26 04:44:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 26 04:44:34 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 100 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=100) [1] r=0 lpr=100 pi=[63,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:35 np0005595445 ceph-mon[80107]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Jan 26 04:44:35 np0005595445 ceph-mon[80107]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Jan 26 04:44:35 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 26 04:44:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[63,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=101) [1]/[0] r=-1 lpr=101 pi=[63,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:35 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 101 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:36.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:36 np0005595445 ceph-mon[80107]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 26 04:44:36 np0005595445 ceph-mon[80107]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 26 04:44:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:36.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:36 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:36 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 26 04:44:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 26 04:44:36 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 26 04:44:36 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 102 pg[9.f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=6 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:36 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 102 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=99/75 les/c/f=100/76/0 sis=101) [1] r=0 lpr=101 pi=[75,101)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 26 04:44:37 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 103 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:37 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 103 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:37 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Jan 26 04:44:37 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Jan 26 04:44:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:38.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 26 04:44:38 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 104 pg[9.10( v 60'1159 (0'0,60'1159] local-lis/les=103/104 n=2 ec=63/48 lis/c=101/63 les/c/f=102/65/0 sis=103) [1] r=0 lpr=103 pi=[63,103)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:38 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:38 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:38 np0005595445 ceph-mon[80107]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 26 04:44:38 np0005595445 ceph-mon[80107]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 26 04:44:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:38 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:38 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Jan 26 04:44:38 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Jan 26 04:44:39 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 26 04:44:39 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 26 04:44:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:40.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488002550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:40.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:40 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:44:40 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 26 04:44:40 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 26 04:44:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 26 04:44:40 np0005595445 ceph-osd[77632]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 26 04:44:41 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 105 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=105) [1] r=0 lpr=105 pi=[63,105)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.456821987 +0000 UTC m=+0.036400771 container create 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:44:41 np0005595445 systemd[84066]: Starting Mark boot as successful...
Jan 26 04:44:41 np0005595445 systemd[1]: Started libpod-conmon-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope.
Jan 26 04:44:41 np0005595445 systemd[84066]: Finished Mark boot as successful.
Jan 26 04:44:41 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.50945272 +0000 UTC m=+0.089031494 container init 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.515465888 +0000 UTC m=+0.095044672 container start 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True)
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.518180478 +0000 UTC m=+0.097759262 container attach 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 04:44:41 np0005595445 unruffled_chatterjee[89201]: 167 167
Jan 26 04:44:41 np0005595445 systemd[1]: libpod-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope: Deactivated successfully.
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.520984802 +0000 UTC m=+0.100563576 container died 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.440808198 +0000 UTC m=+0.020386992 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:44:41 np0005595445 systemd[1]: var-lib-containers-storage-overlay-d11027f73f59693698317623e656a37bfa891ba8c1611aa47bed5adef81e33fe-merged.mount: Deactivated successfully.
Jan 26 04:44:41 np0005595445 podman[89183]: 2026-01-26 09:44:41.552020612 +0000 UTC m=+0.131599396 container remove 0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_chatterjee, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 04:44:41 np0005595445 systemd[1]: libpod-conmon-0b2fbc6cc3f055852090c599c19075968ea058f0b3a72a077f076777f3faf255.scope: Deactivated successfully.
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: Reconfiguring rgw.rgw.compute-1.fbcidm (unknown last config time)...
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.fbcidm", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: Reconfiguring daemon rgw.rgw.compute-1.fbcidm on compute-1
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 26 04:44:42 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 106 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=106) [1]/[0] r=-1 lpr=106 pi=[63,106)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:42 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 106 pg[9.11( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=106) [1]/[0] r=-1 lpr=106 pi=[63,106)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:42.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:42.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:42 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:43 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 26 04:44:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 26 04:44:43 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 107 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=107) [1] r=0 lpr=107 pi=[63,107)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:44:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:44:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:43 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:44:44 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 26 04:44:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 26 04:44:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=108) [1]/[0] r=-1 lpr=108 pi=[63,108)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:44 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 108 pg[9.12( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=63/63 les/c/f=65/65/0 sis=108) [1]/[0] r=-1 lpr=108 pi=[63,108)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:44 np0005595445 systemd-logind[783]: New session 38 of user zuul.
Jan 26 04:44:44 np0005595445 systemd[1]: Started Session 38 of User zuul.
Jan 26 04:44:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:44.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:44:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:44.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:44:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:44 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:45 np0005595445 python3.9[89372]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 04:44:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 26 04:44:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 26 04:44:45 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 109 pg[9.11( v 60'1159 (0'0,60'1159] local-lis/les=108/109 n=5 ec=63/48 lis/c=106/63 les/c/f=107/65/0 sis=108) [1] r=0 lpr=108 pi=[63,108)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:46 np0005595445 python3.9[89547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:44:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:46.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:44:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 26 04:44:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:46 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 110 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:46 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 110 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:46.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:46 : epoch 69773726 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:44:47 np0005595445 python3.9[89703]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:44:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 26 04:44:47 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 111 pg[9.12( v 60'1159 (0'0,60'1159] local-lis/les=110/111 n=4 ec=63/48 lis/c=108/63 les/c/f=109/65/0 sis=110) [1] r=0 lpr=110 pi=[63,110)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:48.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:48 np0005595445 python3.9[89857]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:44:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003780 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:48.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:48 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:49 np0005595445 python3.9[90012]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:44:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:50.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:50 np0005595445 python3.9[90164]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:44:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:50.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:50 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 26 04:44:50 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 26 04:44:50 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:51 np0005595445 python3.9[90339]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:44:51 np0005595445 network[90356]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:44:51 np0005595445 network[90357]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:44:51 np0005595445 network[90358]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:44:51 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 26 04:44:51 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:44:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:52.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:52.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:52 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 26 04:44:52 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 113 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=113) [1] r=0 lpr=113 pi=[78,113)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:52 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 26 04:44:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094453 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:44:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 26 04:44:53 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 114 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=114) [1]/[2] r=-1 lpr=114 pi=[78,114)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:53 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 114 pg[9.15( empty local-lis/les=0/0 n=0 ec=63/48 lis/c=78/78 les/c/f=79/79/0 sis=114) [1]/[2] r=-1 lpr=114 pi=[78,114)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 26 04:44:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:54.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:54.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:54 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:54 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 26 04:44:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 26 04:44:54 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 115 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=13.068075180s) [2] r=-1 lpr=115 pi=[79,115)/1 crt=60'1159 mlcod 0'0 active pruub 286.115386963s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:54 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 115 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=115 pruub=13.068030357s) [2] r=-1 lpr=115 pi=[79,115)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 286.115386963s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:44:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 26 04:44:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 26 04:44:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 luod=0'0 crt=60'1159 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=0/0 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 crt=60'1159 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:55 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 116 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 04:44:56 np0005595445 python3.9[90648]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:44:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:56.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14940037e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:56.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:56 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 26 04:44:56 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 117 pg[9.15( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=114/78 les/c/f=115/79/0 sis=116) [1] r=0 lpr=116 pi=[78,116)/1 crt=60'1159 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:57 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 117 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=116) [2]/[1] async=[2] r=0 lpr=116 pi=[79,116)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:44:57 np0005595445 python3.9[90798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:44:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 26 04:44:57 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 118 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.382803917s) [2] async=[2] r=-1 lpr=118 pi=[79,118)/1 crt=60'1159 mlcod 60'1159 active pruub 291.080383301s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:44:57 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 118 pg[9.16( v 60'1159 (0'0,60'1159] local-lis/les=116/117 n=4 ec=63/48 lis/c=116/79 les/c/f=117/80/0 sis=118 pruub=15.382746696s) [2] r=-1 lpr=118 pi=[79,118)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 291.080383301s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:44:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:44:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:44:58.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:44:58 np0005595445 python3.9[90955]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:44:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 26 04:44:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:44:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:44:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:44:58.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:44:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:44:58 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1494003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:44:59 np0005595445 python3.9[91114]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:45:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:00.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:00 np0005595445 python3.9[91198]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:45:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:45:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 26 04:45:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:45:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:00.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:45:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:00 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 26 04:45:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 26 04:45:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:02.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:02 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:02.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 26 04:45:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 26 04:45:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 26 04:45:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:04.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1488003880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:04 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:04.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:04 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 26 04:45:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 26 04:45:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 26 04:45:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 26 04:45:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 26 04:45:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:06.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4001320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:06 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:45:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:06.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:45:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 26 04:45:07 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 26 04:45:07 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 124 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=124 pruub=12.692170143s) [0] r=-1 lpr=124 pi=[89,124)/1 crt=60'1159 mlcod 0'0 active pruub 297.793975830s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:07 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 124 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=124 pruub=12.692132950s) [0] r=-1 lpr=124 pi=[89,124)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 297.793975830s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 26 04:45:07 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 125 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:07 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 125 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=89/90 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 04:45:08 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 26 04:45:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 26 04:45:08 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 126 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=89/89 les/c/f=90/90/0 sis=125) [0]/[1] async=[0] r=0 lpr=125 pi=[89,125)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:45:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:08 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:45:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:08.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:45:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 26 04:45:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 26 04:45:09 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 26 04:45:09 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 127 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=125/89 les/c/f=126/90/0 sis=127 pruub=14.990523338s) [0] async=[0] r=-1 lpr=127 pi=[89,127)/1 crt=60'1159 mlcod 60'1159 active pruub 302.718963623s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:09 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 127 pg[9.1a( v 60'1159 (0'0,60'1159] local-lis/les=125/126 n=4 ec=63/48 lis/c=125/89 les/c/f=126/90/0 sis=127 pruub=14.990057945s) [0] r=-1 lpr=127 pi=[89,127)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 302.718963623s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 26 04:45:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478001f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:10 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:10.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 26 04:45:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:12.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:12 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14780020f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:12.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 26 04:45:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:14 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:14.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:15 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:16.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 26 04:45:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 26 04:45:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:16 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14a4002300 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:16.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:17 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 26 04:45:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:18.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 26 04:45:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 26 04:45:18 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 132 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=132 pruub=11.250358582s) [2] r=-1 lpr=132 pi=[98,132)/1 crt=60'1159 mlcod 0'0 active pruub 308.035430908s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:18 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 132 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=132 pruub=11.250319481s) [2] r=-1 lpr=132 pi=[98,132)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 308.035430908s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14880041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1478003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:18 np0005595445 kernel: ganesha.nfsd[87014]: segfault at 50 ip 00007f152f46c32e sp 00007f14b5ffa210 error 4 in libntirpc.so.5.8[7f152f451000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 26 04:45:18 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:45:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[86200]: 26/01/2026 09:45:18 : epoch 69773726 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f14800041c0 fd 48 proxy ignored for local
Jan 26 04:45:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:18.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:18 np0005595445 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 26 04:45:18 np0005595445 systemd[1]: Started Process Core Dump (PID 91352/UID 0).
Jan 26 04:45:19 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 26 04:45:19 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 133 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:19 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 133 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=98/99 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 04:45:19 np0005595445 systemd-coredump[91353]: Process 86204 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007f152f46c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:45:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 26 04:45:19 np0005595445 systemd[1]: systemd-coredump@0-91352-0.service: Deactivated successfully.
Jan 26 04:45:19 np0005595445 systemd[1]: systemd-coredump@0-91352-0.service: Consumed 1.002s CPU time.
Jan 26 04:45:19 np0005595445 podman[91359]: 2026-01-26 09:45:19.975237784 +0000 UTC m=+0.025710335 container died 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 26 04:45:19 np0005595445 systemd[1]: var-lib-containers-storage-overlay-31ac196e154da090dbbc83357691609c1e8abfb8268037421008e730e1429aee-merged.mount: Deactivated successfully.
Jan 26 04:45:20 np0005595445 podman[91359]: 2026-01-26 09:45:20.011432283 +0000 UTC m=+0.061904814 container remove 19c026de89e744ddc79168cf6d35f36da879cfc6e36f4542a44b0a0f6b664c36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:45:20 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:45:20 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:45:20 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.483s CPU time.
Jan 26 04:45:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:20.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:20 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 26 04:45:20 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 134 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=98/98 les/c/f=99/99/0 sis=133) [2]/[1] async=[2] r=0 lpr=133 pi=[98,133)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:45:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:20.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 26 04:45:21 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 135 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=133/98 les/c/f=134/99/0 sis=135 pruub=14.956477165s) [2] async=[2] r=-1 lpr=135 pi=[98,135)/1 crt=60'1159 mlcod 60'1159 active pruub 314.923736572s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:21 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 135 pg[9.1d( v 60'1159 (0'0,60'1159] local-lis/les=133/134 n=5 ec=63/48 lis/c=133/98 les/c/f=134/99/0 sis=135 pruub=14.956420898s) [2] r=-1 lpr=135 pi=[98,135)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 314.923736572s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:22.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.946405) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722946448, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3213, "num_deletes": 251, "total_data_size": 9968564, "memory_usage": 10224976, "flush_reason": "Manual Compaction"}
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722982459, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6284817, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7369, "largest_seqno": 10577, "table_properties": {"data_size": 6270463, "index_size": 9248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4037, "raw_key_size": 36445, "raw_average_key_size": 22, "raw_value_size": 6239048, "raw_average_value_size": 3914, "num_data_blocks": 402, "num_entries": 1594, "num_filter_entries": 1594, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420622, "oldest_key_time": 1769420622, "file_creation_time": 1769420722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 36102 microseconds, and 10429 cpu microseconds.
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.982508) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6284817 bytes OK
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.982531) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985011) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985050) EVENT_LOG_v1 {"time_micros": 1769420722985045, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.985077) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9952950, prev total WAL file size 9952950, number of live WAL files 2.
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.987358) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6137KB)], [18(10MB)]
Jan 26 04:45:22 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420722987441, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17228625, "oldest_snapshot_seqno": -1}
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4056 keys, 14807031 bytes, temperature: kUnknown
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723076571, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14807031, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14774547, "index_size": 21238, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10181, "raw_key_size": 103546, "raw_average_key_size": 25, "raw_value_size": 14694884, "raw_average_value_size": 3622, "num_data_blocks": 914, "num_entries": 4056, "num_filter_entries": 4056, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.076844) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14807031 bytes
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.089511) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.1 rd, 166.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.0, 10.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.1) write-amplify(2.4) OK, records in: 4594, records dropped: 538 output_compression: NoCompression
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.089555) EVENT_LOG_v1 {"time_micros": 1769420723089537, "job": 8, "event": "compaction_finished", "compaction_time_micros": 89207, "compaction_time_cpu_micros": 32678, "output_level": 6, "num_output_files": 1, "total_output_size": 14807031, "num_input_records": 4594, "num_output_records": 4056, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723091379, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420723094079, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:22.987182) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094209) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:45:23.094211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:45:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:24.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094524 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:45:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:24.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:26.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:26.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:27 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 26 04:45:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 26 04:45:27 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 137 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=137 pruub=13.013389587s) [0] r=-1 lpr=137 pi=[79,137)/1 crt=60'1159 mlcod 0'0 active pruub 318.115966797s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:27 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 137 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=137 pruub=13.013346672s) [0] r=-1 lpr=137 pi=[79,137)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 318.115966797s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 26 04:45:27 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 138 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:27 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 138 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=79/80 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 04:45:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 26 04:45:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:28.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 26 04:45:28 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=139 pruub=12.346158981s) [0] r=-1 lpr=139 pi=[101,139)/1 crt=60'1159 mlcod 0'0 active pruub 319.068176270s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:28 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=139 pruub=12.346119881s) [0] r=-1 lpr=139 pi=[101,139)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 319.068176270s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:28 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 139 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=79/79 les/c/f=80/80/0 sis=138) [0]/[1] async=[0] r=0 lpr=138 pi=[79,138)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:45:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 26 04:45:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 26 04:45:29 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 26 04:45:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=138/79 les/c/f=139/80/0 sis=140 pruub=15.001749039s) [0] async=[0] r=-1 lpr=140 pi=[79,140)/1 crt=60'1159 mlcod 60'1159 active pruub 322.724273682s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=101/102 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 26 04:45:29 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 140 pg[9.1e( v 60'1159 (0'0,60'1159] local-lis/les=138/139 n=5 ec=63/48 lis/c=138/79 les/c/f=139/80/0 sis=140 pruub=15.001703262s) [0] r=-1 lpr=140 pi=[79,140)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 322.724273682s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:30 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 1.
Jan 26 04:45:30 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:45:30 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.483s CPU time.
Jan 26 04:45:30 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:45:30 np0005595445 podman[91485]: 2026-01-26 09:45:30.394046351 +0000 UTC m=+0.021084965 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:45:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:30.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:30.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:31 np0005595445 podman[91485]: 2026-01-26 09:45:31.192942454 +0000 UTC m=+0.819981038 container create f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 26 04:45:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 26 04:45:31 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 141 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=101/101 les/c/f=102/102/0 sis=140) [0]/[1] async=[0] r=0 lpr=140 pi=[101,140)/1 crt=60'1159 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 26 04:45:31 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:45:31 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:45:31 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:45:31 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:45:31 np0005595445 podman[91485]: 2026-01-26 09:45:31.396696531 +0000 UTC m=+1.023735145 container init f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Jan 26 04:45:31 np0005595445 podman[91485]: 2026-01-26 09:45:31.401690311 +0000 UTC m=+1.028728905 container start f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:45:31 np0005595445 bash[91485]: f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f
Jan 26 04:45:31 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:45:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:31 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:45:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 26 04:45:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 142 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=140/101 les/c/f=141/102/0 sis=142 pruub=14.966416359s) [0] async=[0] r=-1 lpr=142 pi=[101,142)/1 crt=60'1159 mlcod 60'1159 active pruub 325.436614990s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 26 04:45:32 np0005595445 ceph-osd[77632]: osd.1 pg_epoch: 142 pg[9.1f( v 60'1159 (0'0,60'1159] local-lis/les=140/141 n=5 ec=63/48 lis/c=140/101 les/c/f=141/102/0 sis=142 pruub=14.966349602s) [0] r=-1 lpr=142 pi=[101,142)/1 crt=60'1159 mlcod 0'0 unknown NOTIFY pruub 325.436614990s@ mbc={}] state<Start>: transitioning to Stray
Jan 26 04:45:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:32.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:32.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 26 04:45:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:34.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:34.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:36.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:45:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:36.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:45:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:37 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:45:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:37 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:45:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:38.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:38.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:40.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:45:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:45:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:42.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:45:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:43 : epoch 697737bb : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:45:44 np0005595445 python3.9[91742]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:45:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:44.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:44.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:46 np0005595445 python3.9[92034]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 04:45:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:45:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:46.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:45:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094546 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:45:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:47 np0005595445 python3.9[92186]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 04:45:47 np0005595445 python3.9[92339]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:45:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:48.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:48 np0005595445 python3.9[92491]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 04:45:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:48.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:50 np0005595445 python3.9[92644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:45:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:50.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:50 np0005595445 python3.9[92796]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:45:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:45:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:50.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:45:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:51 np0005595445 python3.9[92874]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:45:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:52.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:52 np0005595445 python3.9[93110]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:45:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:54 np0005595445 python3.9[93265]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 04:45:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:54.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:54 np0005595445 python3.9[93418]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 04:45:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:54.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:45:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:45:56 np0005595445 python3.9[93597]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 04:45:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:45:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:45:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:56.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:45:56 np0005595445 python3.9[93751]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 04:45:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:56.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:57 np0005595445 python3.9[93904]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:45:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:45:58.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:45:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:45:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:45:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:45:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:45:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:45:58.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:00 np0005595445 python3.9[94058]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:46:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:00.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:00 np0005595445 python3.9[94235]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:46:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:46:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:00.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:46:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:01 np0005595445 python3.9[94313]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:46:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:46:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:46:02 np0005595445 python3.9[94466]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:46:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:02.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:02 np0005595445 python3.9[94544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:46:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b900032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:02.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:04 np0005595445 python3.9[94697]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:46:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:46:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:46:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094606 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:46:06 np0005595445 python3.9[94849]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:46:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:06.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:06.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:07 np0005595445 python3.9[95001]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 04:46:07 np0005595445 python3.9[95152]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:46:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:08.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:08.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:09 np0005595445 python3.9[95304]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:46:09 np0005595445 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 04:46:09 np0005595445 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 04:46:09 np0005595445 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 04:46:09 np0005595445 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 04:46:09 np0005595445 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 04:46:10 np0005595445 python3.9[95466]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 04:46:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:10.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:46:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:10.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:46:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:12.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:14 np0005595445 python3.9[95620]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:46:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:46:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:14.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:46:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b8c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:46:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:46:15 np0005595445 python3.9[95774]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:46:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:15 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:46:15 np0005595445 systemd[1]: session-38.scope: Deactivated successfully.
Jan 26 04:46:15 np0005595445 systemd[1]: session-38.scope: Consumed 1min 3.016s CPU time.
Jan 26 04:46:15 np0005595445 systemd-logind[783]: Session 38 logged out. Waiting for processes to exit.
Jan 26 04:46:15 np0005595445 systemd-logind[783]: Removed session 38.
Jan 26 04:46:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:16.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:16.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:46:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:46:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:18.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:18.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:20.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:20.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:21 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:46:21 np0005595445 systemd-logind[783]: New session 39 of user zuul.
Jan 26 04:46:21 np0005595445 systemd[1]: Started Session 39 of User zuul.
Jan 26 04:46:22 np0005595445 python3.9[95987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:46:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:46:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:22.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:46:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:22.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:23 np0005595445 python3.9[96144]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 04:46:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:24.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:24.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:25 np0005595445 python3.9[96299]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:46:25 np0005595445 python3.9[96384]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 04:46:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094626 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:46:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:26.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:26.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:28 np0005595445 python3.9[96538]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:46:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:28.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:28.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:46:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:30.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:46:30 np0005595445 python3.9[96694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:46:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:30 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:30.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:31 np0005595445 python3.9[96848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:46:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:46:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:32.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:46:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:32 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:32.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:33 np0005595445 python3.9[97000]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 04:46:34 np0005595445 python3.9[97151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:46:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:34.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:34 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:34.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:35 np0005595445 python3.9[97309]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:46:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:36.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:36 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:36.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:37 np0005595445 python3.9[97489]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:46:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:38.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:38 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:38.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:39 np0005595445 python3.9[97776]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 04:46:40 np0005595445 python3.9[97927]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:46:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:40.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:40 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:40.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:41 np0005595445 python3.9[98081]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:46:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:42.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:42 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:42.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:44 np0005595445 python3.9[98236]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:46:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:44.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:44 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:44.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:46.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:46 np0005595445 python3.9[98390]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:46:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:46 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:46.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:47 np0005595445 python3.9[98545]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 26 04:46:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:46:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:48.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:46:48 np0005595445 systemd[1]: session-39.scope: Deactivated successfully.
Jan 26 04:46:48 np0005595445 systemd[1]: session-39.scope: Consumed 18.227s CPU time.
Jan 26 04:46:48 np0005595445 systemd-logind[783]: Session 39 logged out. Waiting for processes to exit.
Jan 26 04:46:48 np0005595445 systemd-logind[783]: Removed session 39.
Jan 26 04:46:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c0036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:48 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:48.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:50.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:50 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:50.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:52.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:52 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:52.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:53 np0005595445 systemd-logind[783]: New session 40 of user zuul.
Jan 26 04:46:53 np0005595445 systemd[1]: Started Session 40 of User zuul.
Jan 26 04:46:54 np0005595445 python3.9[98731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:46:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:54.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0002000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:54 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:46:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:54.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:46:55 np0005595445 python3.9[98885]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:46:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:46:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:56.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:56 np0005595445 python3.9[99104]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:46:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:56 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:57.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:57 np0005595445 systemd[1]: session-40.scope: Deactivated successfully.
Jan 26 04:46:57 np0005595445 systemd[1]: session-40.scope: Consumed 2.288s CPU time.
Jan 26 04:46:57 np0005595445 systemd-logind[783]: Session 40 logged out. Waiting for processes to exit.
Jan 26 04:46:57 np0005595445 systemd-logind[783]: Removed session 40.
Jan 26 04:46:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:46:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:46:58.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:46:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b90004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:46:58 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:46:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:46:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:46:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:46:59.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:47:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:00.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:00 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:01.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:02.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:02 np0005595445 systemd-logind[783]: New session 41 of user zuul.
Jan 26 04:47:02 np0005595445 systemd[1]: Started Session 41 of User zuul.
Jan 26 04:47:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:02 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:03.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:03 np0005595445 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 04:47:03 np0005595445 systemd[1]: session-19.scope: Consumed 9.012s CPU time.
Jan 26 04:47:03 np0005595445 systemd-logind[783]: Session 19 logged out. Waiting for processes to exit.
Jan 26 04:47:03 np0005595445 systemd-logind[783]: Removed session 19.
Jan 26 04:47:03 np0005595445 python3.9[99369]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:47:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:47:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:04.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:47:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:04 np0005595445 python3.9[99523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:47:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:04 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:05.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:05 np0005595445 python3.9[99680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:47:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:47:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:06.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:47:06 np0005595445 python3.9[99764]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:47:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba8008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:06 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:07.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:47:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:08.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:47:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:08 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:09.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:09 np0005595445 python3.9[99920]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:47:10 np0005595445 python3.9[100116]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:10.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:10 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:11.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:11 np0005595445 python3.9[100293]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:47:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:47:12 np0005595445 python3.9[100459]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:47:12 np0005595445 python3.9[100537]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:12.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:12 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:13.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:13 np0005595445 python3.9[100689]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:47:13 np0005595445 python3.9[100768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:47:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094714 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:47:14 np0005595445 python3.9[100920]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:47:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840046d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:14 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:15.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:15 np0005595445 python3.9[101072]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:47:15 np0005595445 python3.9[101250]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:47:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:16 np0005595445 python3.9[101402]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:47:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:16.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:16 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b840046f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:17.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:17 np0005595445 python3.9[101557]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:47:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:18.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba80096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:18 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:19.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:20 np0005595445 python3.9[101711]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:47:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:20 np0005595445 python3.9[101865]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:47:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:20 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:21.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:21 np0005595445 python3.9[102018]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:47:22 np0005595445 python3.9[102171]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:47:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:22.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b84004cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:22 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:23.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:23 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:47:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094723 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:47:23 np0005595445 python3.9[102326]: ansible-service_facts Invoked
Jan 26 04:47:23 np0005595445 network[102343]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:47:23 np0005595445 network[102344]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:47:23 np0005595445 network[102345]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:47:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:24.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:24 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c001000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:47:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:47:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:26.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:26 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b80003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:28.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4b7c001000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba0003500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[91500]: 26/01/2026 09:47:28 : epoch 697737bb : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4ba800a7e0 fd 39 proxy ignored for local
Jan 26 04:47:28 np0005595445 kernel: ganesha.nfsd[98575]: segfault at 50 ip 00007f4c3167c32e sp 00007f4b9effc210 error 4 in libntirpc.so.5.8[7f4c31661000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 26 04:47:28 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:47:28 np0005595445 systemd[1]: Started Process Core Dump (PID 102487/UID 0).
Jan 26 04:47:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:29.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:30 np0005595445 systemd-coredump[102488]: Process 91504 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 61:#012#0  0x00007f4c3167c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f4c31686900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 26 04:47:30 np0005595445 systemd[1]: systemd-coredump@1-102487-0.service: Deactivated successfully.
Jan 26 04:47:30 np0005595445 systemd[1]: systemd-coredump@1-102487-0.service: Consumed 1.084s CPU time.
Jan 26 04:47:30 np0005595445 podman[102495]: 2026-01-26 09:47:30.160536133 +0000 UTC m=+0.028951176 container died f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:47:30 np0005595445 systemd[1]: var-lib-containers-storage-overlay-acaa23a82ff129fe08b8ca585e5004a757ed19363666aee393d8ce5d14943f83-merged.mount: Deactivated successfully.
Jan 26 04:47:30 np0005595445 podman[102495]: 2026-01-26 09:47:30.198815995 +0000 UTC m=+0.067231068 container remove f32869ac6ab743e2b48608e16b6d5e4d055ccee739772d437fabcd66b742e07f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 26 04:47:30 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:47:30 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:47:30 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.482s CPU time.
Jan 26 04:47:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:30.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:32 np0005595445 python3.9[102858]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:47:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:32.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:33.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:34.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:34 np0005595445 python3.9[103014]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 04:47:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094734 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:47:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094734 (4) : haproxy version is 2.3.17-d1c9119
Jan 26 04:47:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [NOTICE] 025/094734 (4) : path to executable is /usr/local/sbin/haproxy
Jan 26 04:47:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [ALERT] 025/094734 (4) : backend 'backend' has no server available!
Jan 26 04:47:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:35.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:36 np0005595445 python3.9[103192]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:47:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:36.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:36 np0005595445 python3.9[103270]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:37.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:37 np0005595445 python3.9[103423]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:47:38 np0005595445 python3.9[103501]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094738 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:47:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:38.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:39.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:40 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 2.
Jan 26 04:47:40 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:47:40 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.482s CPU time.
Jan 26 04:47:40 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:47:40 np0005595445 podman[103700]: 2026-01-26 09:47:40.593768898 +0000 UTC m=+0.042531631 container create cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Jan 26 04:47:40 np0005595445 systemd[84066]: Created slice User Background Tasks Slice.
Jan 26 04:47:40 np0005595445 systemd[84066]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 04:47:40 np0005595445 systemd[84066]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 04:47:40 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:47:40 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:47:40 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:47:40 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:47:40 np0005595445 python3.9[103670]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:40 np0005595445 podman[103700]: 2026-01-26 09:47:40.572060521 +0000 UTC m=+0.020823264 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:47:40 np0005595445 podman[103700]: 2026-01-26 09:47:40.697642331 +0000 UTC m=+0.146405034 container init cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 26 04:47:40 np0005595445 podman[103700]: 2026-01-26 09:47:40.702203046 +0000 UTC m=+0.150965739 container start cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 04:47:40 np0005595445 bash[103700]: cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef
Jan 26 04:47:40 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:47:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:40.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:47:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:40 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:47:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:42 np0005595445 python3.9[103910]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:47:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:42.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:43 np0005595445 python3.9[103994]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:47:44 np0005595445 systemd[1]: session-41.scope: Deactivated successfully.
Jan 26 04:47:44 np0005595445 systemd[1]: session-41.scope: Consumed 23.866s CPU time.
Jan 26 04:47:44 np0005595445 systemd-logind[783]: Session 41 logged out. Waiting for processes to exit.
Jan 26 04:47:44 np0005595445 systemd-logind[783]: Removed session 41.
Jan 26 04:47:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:44.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:45.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094745 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:47:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:46.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 04:47:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 04:47:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:47:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:46 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:47:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:48.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:49.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:49 np0005595445 systemd-logind[783]: New session 42 of user zuul.
Jan 26 04:47:49 np0005595445 systemd[1]: Started Session 42 of User zuul.
Jan 26 04:47:50 np0005595445 python3.9[104180]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:50.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:51.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:51 np0005595445 python3.9[104332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:47:51 np0005595445 python3.9[104411]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:47:52 np0005595445 systemd[1]: session-42.scope: Deactivated successfully.
Jan 26 04:47:52 np0005595445 systemd[1]: session-42.scope: Consumed 1.528s CPU time.
Jan 26 04:47:52 np0005595445 systemd-logind[783]: Session 42 logged out. Waiting for processes to exit.
Jan 26 04:47:52 np0005595445 systemd-logind[783]: Removed session 42.
Jan 26 04:47:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:52.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000009:nfs.cephfs.0: -2
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:52 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:53.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:54.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094754 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:47:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:54 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:47:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:55.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:47:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:47:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:56.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:56 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe710001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:57 np0005595445 systemd-logind[783]: New session 43 of user zuul.
Jan 26 04:47:57 np0005595445 systemd[1]: Started Session 43 of User zuul.
Jan 26 04:47:58 np0005595445 python3.9[104635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:47:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:47:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:47:58.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:47:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:47:58 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:47:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:47:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:47:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:47:59.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:47:59 np0005595445 python3.9[104791]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:00 np0005595445 python3.9[104967]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:00.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:00 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:01.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:01 np0005595445 python3.9[105045]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.49xu_gz8 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:02 np0005595445 python3.9[105198]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:02 np0005595445 python3.9[105276]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.k66gfg_f recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:02 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:03.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:03 np0005595445 python3.9[105428]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:48:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094803 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:48:03 np0005595445 python3.9[105581]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:04 np0005595445 python3.9[105659]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:48:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:04.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:04 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:04 np0005595445 python3.9[105812]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:05.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:05 np0005595445 python3.9[105891]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:48:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:06 np0005595445 python3.9[106044]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:06.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:06 np0005595445 python3.9[106196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:06 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:07 np0005595445 python3.9[106274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.689261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887689308, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1884, "num_deletes": 250, "total_data_size": 5391717, "memory_usage": 5477968, "flush_reason": "Manual Compaction"}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887705108, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2272474, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10582, "largest_seqno": 12461, "table_properties": {"data_size": 2266628, "index_size": 2981, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14457, "raw_average_key_size": 20, "raw_value_size": 2253941, "raw_average_value_size": 3147, "num_data_blocks": 132, "num_entries": 716, "num_filter_entries": 716, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420723, "oldest_key_time": 1769420723, "file_creation_time": 1769420887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 15913 microseconds, and 5537 cpu microseconds.
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.705175) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2272474 bytes OK
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.705204) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706690) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706733) EVENT_LOG_v1 {"time_micros": 1769420887706728, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.706752) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5383367, prev total WAL file size 5383367, number of live WAL files 2.
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.708288) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2219KB)], [21(14MB)]
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887708560, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17079505, "oldest_snapshot_seqno": -1}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4327 keys, 15332897 bytes, temperature: kUnknown
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887805375, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15332897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15299746, "index_size": 21201, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10885, "raw_key_size": 109502, "raw_average_key_size": 25, "raw_value_size": 15216508, "raw_average_value_size": 3516, "num_data_blocks": 911, "num_entries": 4327, "num_filter_entries": 4327, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420887, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.805607) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15332897 bytes
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.809749) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.3 rd, 158.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 14.1 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(14.3) write-amplify(6.7) OK, records in: 4772, records dropped: 445 output_compression: NoCompression
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.809773) EVENT_LOG_v1 {"time_micros": 1769420887809764, "job": 10, "event": "compaction_finished", "compaction_time_micros": 96886, "compaction_time_cpu_micros": 29576, "output_level": 6, "num_output_files": 1, "total_output_size": 15332897, "num_input_records": 4772, "num_output_records": 4327, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887810336, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420887812867, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.708171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:07.812967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:08 np0005595445 python3.9[106427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:08 np0005595445 python3.9[106505]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:08.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:08 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:09.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:09 np0005595445 python3.9[106658]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:48:09 np0005595445 systemd[1]: Reloading.
Jan 26 04:48:09 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:48:09 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:48:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:10.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:10 np0005595445 python3.9[106848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:10 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:11.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:11 np0005595445 python3.9[106993]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:48:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:48:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:48:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:48:11 np0005595445 python3.9[107160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:12 np0005595445 python3.9[107238]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:12.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:48:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:12 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7100096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:13.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:13 np0005595445 python3.9[107390]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:48:13 np0005595445 systemd[1]: Reloading.
Jan 26 04:48:13 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:48:13 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:48:13 np0005595445 systemd[1]: Starting Create netns directory...
Jan 26 04:48:13 np0005595445 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 04:48:13 np0005595445 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 04:48:13 np0005595445 systemd[1]: Finished Create netns directory.
Jan 26 04:48:14 np0005595445 python3.9[107584]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:48:14 np0005595445 network[107601]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:48:14 np0005595445 network[107602]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:48:14 np0005595445 network[107603]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:48:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:14.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:14 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:15.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:15 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:48:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:15 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:48:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:16.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:16 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:17.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:17 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:48:17 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:48:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:18.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:48:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:18 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:19.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:20.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:20 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:21.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:22 np0005595445 python3.9[107919]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:22.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe71000a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:22 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e0003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:23 np0005595445 python3.9[107997]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:24 np0005595445 python3.9[108150]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:24 np0005595445 python3.9[108302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:24.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6fc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:24 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe704001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:25.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:25 np0005595445 python3.9[108383]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094825 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:48:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:26 np0005595445 python3.9[108536]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 04:48:26 np0005595445 systemd[1]: Starting Time & Date Service...
Jan 26 04:48:26 np0005595445 systemd[1]: Started Time & Date Service.
Jan 26 04:48:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:26.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:26 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6e4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:26 np0005595445 kernel: ganesha.nfsd[104453]: segfault at 50 ip 00007fe79115d32e sp 00007fe6f8ff8210 error 4 in libntirpc.so.5.8[7fe791142000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 26 04:48:26 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:48:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[103716]: 26/01/2026 09:48:26 : epoch 6977383c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe6ec003c10 fd 38 proxy ignored for local
Jan 26 04:48:27 np0005595445 systemd[1]: Started Process Core Dump (PID 108613/UID 0).
Jan 26 04:48:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:27.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:27 np0005595445 python3.9[108694]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:28 np0005595445 python3.9[108847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:28 np0005595445 systemd-coredump[108618]: Process 103732 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007fe79115d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:48:28 np0005595445 python3.9[108925]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:28 np0005595445 systemd[1]: systemd-coredump@2-108613-0.service: Deactivated successfully.
Jan 26 04:48:28 np0005595445 systemd[1]: systemd-coredump@2-108613-0.service: Consumed 1.524s CPU time.
Jan 26 04:48:28 np0005595445 podman[108930]: 2026-01-26 09:48:28.739252573 +0000 UTC m=+0.027130971 container died cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 26 04:48:28 np0005595445 systemd[1]: var-lib-containers-storage-overlay-ccab391d2ddf76ae48998672269732e3b03ef5875ece04442f9e9329de58aac3-merged.mount: Deactivated successfully.
Jan 26 04:48:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:28.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:28 np0005595445 podman[108930]: 2026-01-26 09:48:28.853150325 +0000 UTC m=+0.141028723 container remove cc086136e1c432f5ca4718ee9ab857dcab4db1a7efab79790d7b9f2c243517ef (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1)
Jan 26 04:48:28 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:48:29 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:48:29 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.763s CPU time.
Jan 26 04:48:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:29.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:29 np0005595445 python3.9[109125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:29 np0005595445 python3.9[109204]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.abg2wr3h recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:30 np0005595445 python3.9[109356]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:30.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:31.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:31 np0005595445 python3.9[109434]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:32 np0005595445 python3.9[109587]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:48:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:32.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094832 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:48:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:33.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:33 np0005595445 python3[109742]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 04:48:33 np0005595445 python3.9[109895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:34 np0005595445 python3.9[109973]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:34.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:35 np0005595445 python3.9[110125]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:35.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:35 np0005595445 python3.9[110251]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420914.6683211-895-169080307139033/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:36 np0005595445 python3.9[110430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:37.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:37 np0005595445 python3.9[110508]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:37 np0005595445 python3.9[110661]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:38 np0005595445 python3.9[110739]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:39 np0005595445 python3.9[110891]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:39.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:39 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 3.
Jan 26 04:48:39 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:48:39 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.763s CPU time.
Jan 26 04:48:39 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:48:39 np0005595445 podman[111017]: 2026-01-26 09:48:39.405756298 +0000 UTC m=+0.038692540 container create 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:48:39 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:48:39 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:48:39 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:48:39 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:48:39 np0005595445 podman[111017]: 2026-01-26 09:48:39.469197153 +0000 UTC m=+0.102133415 container init 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Jan 26 04:48:39 np0005595445 podman[111017]: 2026-01-26 09:48:39.473909824 +0000 UTC m=+0.106846066 container start 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True)
Jan 26 04:48:39 np0005595445 bash[111017]: 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8
Jan 26 04:48:39 np0005595445 podman[111017]: 2026-01-26 09:48:39.387255377 +0000 UTC m=+0.020191639 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:48:39 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:48:39 np0005595445 python3.9[111006]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:48:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:48:40 np0005595445 python3.9[111226]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:48:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:40.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:41.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:41 np0005595445 python3.9[111381]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:42 np0005595445 python3.9[111534]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:42.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:43.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:43 np0005595445 python3.9[111686]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:44 np0005595445 python3.9[111839]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 04:48:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:44 np0005595445 python3.9[111991]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 04:48:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:45.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:45 np0005595445 systemd[1]: session-43.scope: Deactivated successfully.
Jan 26 04:48:45 np0005595445 systemd[1]: session-43.scope: Consumed 29.700s CPU time.
Jan 26 04:48:45 np0005595445 systemd-logind[783]: Session 43 logged out. Waiting for processes to exit.
Jan 26 04:48:45 np0005595445 systemd-logind[783]: Removed session 43.
Jan 26 04:48:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:48:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:48:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:46.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:47.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:48.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:49.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:50.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:51 np0005595445 systemd-logind[783]: New session 44 of user zuul.
Jan 26 04:48:51 np0005595445 systemd[1]: Started Session 44 of User zuul.
Jan 26 04:48:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:51.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:48:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094851 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:48:51 np0005595445 python3.9[112187]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 04:48:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:52.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:52 np0005595445 python3.9[112339]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:48:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f578c000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57780016e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:53.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:53 np0005595445 python3.9[112499]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 26 04:48:54 np0005595445 python3.9[112651]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.or619ee0 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:48:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:48:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:54.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:48:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:54 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094855 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:48:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5790001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:55.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:55 np0005595445 python3.9[112776]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.or619ee0 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769420934.1628942-103-51088690058875/.source.or619ee0 _original_basename=.9b015ofb follow=False checksum=e638a8a1231bcbc6594aeda119d676a260ed9e9f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:48:56 np0005595445 python3.9[112954]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:48:56 np0005595445 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 04:48:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57900023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:48:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:57.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:48:57 np0005595445 python3.9[113108]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0TZpcPGqQPKNdLKsJSWd1uRV3wOVDiIo3gYwVWAuH5m+Wvpw34ZI+6+d4y3DWMqDRZVWAVV0NNFB+b4MQeivx4S7KMCvBctzJ6VIyUDL5NZrwys0sYPH+33ncdZd6C8LrfCvIct+DbWCx72RQ+G0yRbYK1r/m5+dzW2411NqWn8kJkBUeLJIqT2vhFoNpO8NaWSVlWEgl5YunYEPS4v5NSM88ke6Gzc5X5sjxsz65REj6/1BXsA+quwcTAe/KC1/1Rr2cufefwf0uayM6sGuUDATjWIw36YqUeL9wc/IDdIEFEvj2hr/v+r6laaKMidOYJXBiQwIWpgWCOosSj4vrPQmDfqjOa8sAn7yWPVgxyARccavEO89zV2lpFcYTdqegPxjB90lD3Q1pMU6veJUWTRo0LAZ6n9rsRBgF0Mhr75T32Lbqf3KBro6/nPrp1XCD08mNv2cEYwp+put7vwvHzN1nPztqMsIDAMJMupwI+Buyr3xCPHe3hcAavahF+YM=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINbUUMKlV4hksqDn2YVVAHPCHip80h7zj0rReM94Ja2l#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFtD30BOt1BlR6BYm8DU7sxF5fAzZ/aciKetiRsXWlbsXS3Z4mVG1ZAF9AhArV+OaapsLeaQFybIC0e2fudJfos=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyi0WEBS9Gc5Xay4vqFSdv0cJGdtezg+CrNF/vjEeF3l4EhpAAj7XRLEhEU1kz0DDKkzclG65hBNPO4/9cfzEa31EsSmzOqjqZp5ri20HVDkiZlUTTklhrbJGydUw6mcy+rIN1qsUugVHwkA9ufZLvzm9wvljzL+WPt1o41GT42NdNzyfPfnqf7HMDziNUNUUZjqsoy+DQnlMl3c3NHiGysPJ6IssbLBCFzPdBHpEYmR8b44qlJEhx3RYWl3QLcXAyoK7VpPdFO4ltMT+0KVVbLO9IUrocCQ4HfafPn/mV1Rq3phDWvCTRfRo07Mu4Oc4XBu+RIk9tt1WTIdT/ZusPUNSkFgprdU9zFIHLR0KyIX4qRSuWBeB20Ic5pvkRvNtwLB8lPt4NVi7bmun6moO8nu6cOjJ61CCAobDSEL/Z2cG3ADucjCSKtWLM0eSdt6T71NmULMhdB8ljIK4em/NCf/qZWjYr70WKyIZ9b8N5lDO8NF1tbPJyu+O0ebq/JN8=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAaib//yQ1QyvWijjfui4OBtTtMt7Dos+hlx8rucs2Tn#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7YXsQWyEQWSdy5tcEAtltn11CwuaqW/S8S3OB1580hTlcLZWLPDHbzSwNDf13HBG9wgLFgmueLB8U6J7wvvcM=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDm+Vrn31pimz+Of4pkRaSS+qazCMrOF2INZ0EZsyoNG5922K2xwdC9F6r4k2L54HPEpDiazPoDsOHQvs1I+CvayNM2D+8hZhvqxZOMimP8b056aM14nht9ADrJUnlaDs57FkgIKQdxma9I0sW8Up3bbLchFOj2grOjH7gRdUBxblzIS01/P5NV8/kPsRXDoCgx+QAxU2nEqyCQd0JXLKoy+v6t+pG7We9wFXXr2z4XmAx7yeU0Y6NsJ1Seies0apLTmfK3HAtj/3LObvZegqVGDFtl5spotTmJdPJUCZhniaUmyYZ4jtIEno86Bf8OhS3NvLsxmNXuJcInlmCHGXDP9FPBrxG+yVB63FUAeyejCXntEyOzXFp8fiCuOVQuqDTWB4UxTRYh3EqVruxhY1taarew/VfsxIAxv6BWsqtvh/6xtRtJ9vTSDHsDTRaOcChfT5BnATFJ+Ilwpve8C4bjRVdlStH+99TgtNPOg2Fxf8scyIHInM9c4Yn7g8YTiyk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICrJdFptF1rp2hjeKcc0nSEhHvDtAYFU4gfqZN6U+WTb#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNa2lKVjuYCljd0rl1qDkTP3ZoTV9fkbcXvtxSizwygrF6dU+RWdeB3LOkT5U/2GTJuWvOqxJBc3Y1d0b3Dj5Do=#012 create=True mode=0644 path=/tmp/ansible.or619ee0 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.851291) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937851370, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 742, "num_deletes": 251, "total_data_size": 1444235, "memory_usage": 1472544, "flush_reason": "Manual Compaction"}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937861433, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 953382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12466, "largest_seqno": 13203, "table_properties": {"data_size": 949852, "index_size": 1374, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7899, "raw_average_key_size": 18, "raw_value_size": 942764, "raw_average_value_size": 2239, "num_data_blocks": 61, "num_entries": 421, "num_filter_entries": 421, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420887, "oldest_key_time": 1769420887, "file_creation_time": 1769420937, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 10212 microseconds, and 5596 cpu microseconds.
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.861506) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 953382 bytes OK
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.861528) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862789) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862803) EVENT_LOG_v1 {"time_micros": 1769420937862800, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.862816) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1440326, prev total WAL file size 1440326, number of live WAL files 2.
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.863330) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(931KB)], [24(14MB)]
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937863404, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 16286279, "oldest_snapshot_seqno": -1}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4232 keys, 12878908 bytes, temperature: kUnknown
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937931401, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12878908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12847661, "index_size": 19553, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108391, "raw_average_key_size": 25, "raw_value_size": 12767355, "raw_average_value_size": 3016, "num_data_blocks": 828, "num_entries": 4232, "num_filter_entries": 4232, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769420937, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.931901) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12878908 bytes
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.955570) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.3 rd, 188.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 14.6 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(30.6) write-amplify(13.5) OK, records in: 4748, records dropped: 516 output_compression: NoCompression
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.955616) EVENT_LOG_v1 {"time_micros": 1769420937955597, "job": 12, "event": "compaction_finished", "compaction_time_micros": 68345, "compaction_time_cpu_micros": 27868, "output_level": 6, "num_output_files": 1, "total_output_size": 12878908, "num_input_records": 4748, "num_output_records": 4232, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937955907, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769420937958645, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.863254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:57 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:48:57.958770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:48:58 np0005595445 python3.9[113261]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.or619ee0' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:48:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:48:58.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:48:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:48:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:48:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:48:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:48:59.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:48:59 np0005595445 python3.9[113415]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.or619ee0 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:48:59 np0005595445 systemd[1]: session-44.scope: Deactivated successfully.
Jan 26 04:48:59 np0005595445 systemd[1]: session-44.scope: Consumed 4.991s CPU time.
Jan 26 04:48:59 np0005595445 systemd-logind[783]: Session 44 logged out. Waiting for processes to exit.
Jan 26 04:48:59 np0005595445 systemd-logind[783]: Removed session 44.
Jan 26 04:49:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:00 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:49:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57900023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:01.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:03.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:49:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:49:04 np0005595445 systemd-logind[783]: New session 45 of user zuul.
Jan 26 04:49:04 np0005595445 systemd[1]: Started Session 45 of User zuul.
Jan 26 04:49:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:04.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5768000e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:05.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:05 np0005595445 python3.9[113599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:49:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:06 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:49:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:06.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5768001920 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:07 np0005595445 python3.9[113756]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 04:49:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:07.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:08 np0005595445 python3.9[113911]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:49:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:08.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760000d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c001cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:09 np0005595445 python3.9[114064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:49:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:09.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:10 np0005595445 python3.9[114219]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:49:10 np0005595445 python3.9[114371]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:10.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:11.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:11 np0005595445 systemd[1]: session-45.scope: Deactivated successfully.
Jan 26 04:49:11 np0005595445 systemd[1]: session-45.scope: Consumed 3.665s CPU time.
Jan 26 04:49:11 np0005595445 systemd-logind[783]: Session 45 logged out. Waiting for processes to exit.
Jan 26 04:49:11 np0005595445 systemd-logind[783]: Removed session 45.
Jan 26 04:49:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094911 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:49:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:12.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c001cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:13.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:14.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:15 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57700038f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:15.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:16 np0005595445 systemd-logind[783]: New session 46 of user zuul.
Jan 26 04:49:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:16 np0005595445 systemd[1]: Started Session 46 of User zuul.
Jan 26 04:49:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:17 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:17.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:17 np0005595445 python3.9[114627]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:49:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:49:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:18 np0005595445 python3.9[114815]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:49:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:18.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5760001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:19 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c002e10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:49:19 np0005595445 python3.9[114899]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 04:49:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:19.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:20.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:21 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600030a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:21.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:21 np0005595445 python3.9[115053]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:49:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:22.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:23 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:23.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:23 np0005595445 python3.9[115230]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 04:49:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:23 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:49:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:49:24 np0005595445 python3.9[115382]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:49:24 np0005595445 python3.9[115532]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:49:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:24.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:25 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:25.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:25 np0005595445 systemd[1]: session-46.scope: Deactivated successfully.
Jan 26 04:49:25 np0005595445 systemd[1]: session-46.scope: Consumed 6.149s CPU time.
Jan 26 04:49:25 np0005595445 systemd-logind[783]: Session 46 logged out. Waiting for processes to exit.
Jan 26 04:49:25 np0005595445 systemd-logind[783]: Removed session 46.
Jan 26 04:49:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:27 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:27.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:28.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:29 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:29.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:30 np0005595445 systemd-logind[783]: New session 47 of user zuul.
Jan 26 04:49:30 np0005595445 systemd[1]: Started Session 47 of User zuul.
Jan 26 04:49:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:49:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:30.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:49:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:31 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:31.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:31 np0005595445 python3.9[115715]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:49:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:32.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:33 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003a70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:33 np0005595445 python3.9[115872]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:33.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:33 np0005595445 python3.9[116025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:34 np0005595445 python3.9[116177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:34.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:35 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:35.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:35 np0005595445 python3.9[116301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420974.070227-149-96041922822908/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=bea7362bba0757b80ba784e089b88d8ab88c30d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:36 np0005595445 python3.9[116467]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:36 np0005595445 python3.9[116602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420975.650681-149-203657932792799/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=679f96efe917c3889f556f17807e671104af52ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:36.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:37 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:37 np0005595445 python3.9[116754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:37 np0005595445 python3.9[116878]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420976.9063451-149-223982833928585/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=cb3c2f57a5234f1a1ac2af8e96022b0fed20bf3d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:38 np0005595445 python3.9[117030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:38.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:39 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:39.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:39 np0005595445 python3.9[117182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:49:39 np0005595445 python3.9[117335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:40 np0005595445 python3.9[117458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420979.5086622-330-264527805643681/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=9cafdd1360dab1b4d60bfa8e86f8276e76706ef0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:40.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:41 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57680011d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:41 np0005595445 python3.9[117610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:41.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:41 np0005595445 python3.9[117735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420980.6596706-330-225209645518720/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=3190a221de07992f337d3e4a96f47a3d3dd4b35b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:42 np0005595445 python3.9[117887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/094942 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:49:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:42.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:43.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 50 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:43 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f576c003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:44 np0005595445 python3.9[118011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420981.886361-330-114505023901714/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=93aa6b8988d194c9c88d8fb17c0ff48744cb9801 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:44 np0005595445 python3.9[118170]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:44.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840020e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:45 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003c80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:45.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:45 np0005595445 python3.9[118322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:46 np0005595445 python3.9[118475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:46 np0005595445 python3.9[118598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420985.6209319-528-122858067030783/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=4415a399963cf83213fc16f9ed5bbf28ef091eab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:46.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:47 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002ea0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:47.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:47 np0005595445 python3.9[118750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:47 np0005595445 python3.9[118874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420986.7746-528-105226788449037/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=3190a221de07992f337d3e4a96f47a3d3dd4b35b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:48 np0005595445 python3.9[119026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:48 np0005595445 python3.9[119149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420987.9584496-528-137666898163810/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=d0eed1af3b55a18e8c6edf61338ef59ffee7ebc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:48.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003ca0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:49 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:49:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:49.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:50 np0005595445 python3.9[119302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:50 np0005595445 python3.9[119454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:50.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784002ea0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003cc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:51 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:51.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:51 np0005595445 python3.9[119577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420990.3987231-722-228210027792333/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:49:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:49:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:52 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:49:52 np0005595445 python3.9[119730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:52.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:53 np0005595445 python3.9[119882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:53 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003ce0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:53.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:53 np0005595445 python3.9[120006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420992.509027-797-269279461498938/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:54 np0005595445 python3.9[120158]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:54 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:49:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:54.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:55 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:55 np0005595445 python3.9[120310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:55.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:55 np0005595445 python3.9[120434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420994.6720812-872-67717865404424/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:49:56 np0005595445 python3.9[120611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:56.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:57 np0005595445 python3.9[120763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:57.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:57 : epoch 69773877 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:49:57 np0005595445 python3.9[120887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420996.695903-944-24203934788152/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:49:58 np0005595445 python3.9[121039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:49:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:49:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:49:58.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:49:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5784003bb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:49:59 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:49:59 np0005595445 python3.9[121191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:49:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:49:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:49:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:49:59.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:49:59 np0005595445 python3.9[121315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769420998.7195146-1011-28607873709536/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:00 np0005595445 ceph-mon[80107]: overall HEALTH_OK
Jan 26 04:50:00 np0005595445 python3.9[121467]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:50:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:00.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:01 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:01 np0005595445 python3.9[121619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:50:02 np0005595445 python3.9[121743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421000.9023566-1079-141695714015866/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=a4f71bf0609e75a0e091c9100076ae4c4a7bed4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:02 np0005595445 systemd[1]: session-47.scope: Deactivated successfully.
Jan 26 04:50:02 np0005595445 systemd[1]: session-47.scope: Consumed 23.606s CPU time.
Jan 26 04:50:02 np0005595445 systemd-logind[783]: Session 47 logged out. Waiting for processes to exit.
Jan 26 04:50:02 np0005595445 systemd-logind[783]: Removed session 47.
Jan 26 04:50:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:03 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095004 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:50:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:04.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:05 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:05.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:06.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003d80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:07 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:07.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:07 np0005595445 systemd-logind[783]: New session 48 of user zuul.
Jan 26 04:50:07 np0005595445 systemd[1]: Started Session 48 of User zuul.
Jan 26 04:50:08 np0005595445 python3.9[121928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:08.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:09 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5770003da0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:09.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:09 np0005595445 python3.9[122080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:10 np0005595445 python3.9[122204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421008.860517-58-130279010580214/.source.conf _original_basename=ceph.conf follow=False checksum=d9847d470420fd34212d6cc1f2ab891aeddd27f2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:50:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:10.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:50:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57600039c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5778003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:11 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:11 np0005595445 python3.9[122356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:11.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:11 np0005595445 python3.9[122480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421010.506798-58-88342665704759/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=e8137016e459ec15b04fac1b40fd6c611375a3cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:12 np0005595445 systemd[1]: session-48.scope: Deactivated successfully.
Jan 26 04:50:12 np0005595445 systemd[1]: session-48.scope: Consumed 2.859s CPU time.
Jan 26 04:50:12 np0005595445 systemd-logind[783]: Session 48 logged out. Waiting for processes to exit.
Jan 26 04:50:12 np0005595445 systemd-logind[783]: Removed session 48.
Jan 26 04:50:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:12.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:13 np0005595445 kernel: ganesha.nfsd[116231]: segfault at 50 ip 00007f581686d32e sp 00007f57a17f9210 error 4 in libntirpc.so.5.8[7f5816852000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 26 04:50:13 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:50:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[111032]: 26/01/2026 09:50:13 : epoch 69773877 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f57840048c0 fd 47 proxy ignored for local
Jan 26 04:50:13 np0005595445 systemd[1]: Started Process Core Dump (PID 122505/UID 0).
Jan 26 04:50:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:13.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:14 np0005595445 systemd-coredump[122506]: Process 111037 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 57:#012#0  0x00007f581686d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:50:14 np0005595445 systemd[1]: systemd-coredump@3-122505-0.service: Deactivated successfully.
Jan 26 04:50:14 np0005595445 systemd[1]: systemd-coredump@3-122505-0.service: Consumed 1.246s CPU time.
Jan 26 04:50:14 np0005595445 podman[122512]: 2026-01-26 09:50:14.48765948 +0000 UTC m=+0.047493320 container died 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 26 04:50:14 np0005595445 systemd[1]: var-lib-containers-storage-overlay-b935da6dc4aeecbe481a98d1071edd21957b8052ba6272d61816dfce610d2b23-merged.mount: Deactivated successfully.
Jan 26 04:50:14 np0005595445 podman[122512]: 2026-01-26 09:50:14.532331632 +0000 UTC m=+0.092165502 container remove 1a7be39caead16f4589c4b559c326459ed0a4d717b062f3a71c6fb265aca0cb8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Jan 26 04:50:14 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:50:14 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:50:14 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.691s CPU time.
Jan 26 04:50:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:14.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:15.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:16.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:17.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:17 np0005595445 systemd-logind[783]: New session 49 of user zuul.
Jan 26 04:50:17 np0005595445 systemd[1]: Started Session 49 of User zuul.
Jan 26 04:50:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:18.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095019 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:50:19 np0005595445 python3.9[122737]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:50:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:20 np0005595445 python3.9[122894]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:50:20 np0005595445 python3.9[123046]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:50:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:20.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:21.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 04:50:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 19.65 MB, 0.03 MB/s#012Interval WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slo
Jan 26 04:50:21 np0005595445 python3.9[123197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:50:22 np0005595445 python3.9[123349]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 04:50:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:23.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:23.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:23 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:24 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:50:24 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 4.
Jan 26 04:50:24 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:50:24 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.691s CPU time.
Jan 26 04:50:24 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:50:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:25.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:25 np0005595445 podman[123702]: 2026-01-26 09:50:25.205778927 +0000 UTC m=+0.043673385 container create a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:50:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:50:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:50:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:50:25 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:50:25 np0005595445 podman[123702]: 2026-01-26 09:50:25.275063302 +0000 UTC m=+0.112957760 container init a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:50:25 np0005595445 podman[123702]: 2026-01-26 09:50:25.280020048 +0000 UTC m=+0.117914506 container start a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:50:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:25 np0005595445 podman[123702]: 2026-01-26 09:50:25.189026698 +0000 UTC m=+0.026921176 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:50:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:25.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:25 np0005595445 bash[123702]: a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:50:25 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:50:25 np0005595445 python3.9[123680]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:50:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:50:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:26 np0005595445 python3.9[123845]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:50:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:27.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:27.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:28 np0005595445 python3.9[123999]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:50:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:29.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:29.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:30 np0005595445 python3[124156]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 04:50:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:31.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:31 np0005595445 python3.9[124334]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:31.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:50:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:50:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:50:31 np0005595445 python3.9[124488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:32 np0005595445 python3.9[124566]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:33.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:50:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:33.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:50:33 np0005595445 python3.9[124718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:34 np0005595445 python3.9[124797]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mqvi4rxi recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:34 np0005595445 python3.9[124949]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:35.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:35.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:35 np0005595445 python3.9[125027]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:36 np0005595445 python3.9[125205]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:37.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:50:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:37.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:50:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:50:37 np0005595445 python3[125358]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 04:50:38 np0005595445 python3.9[125523]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:39.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:39 np0005595445 python3.9[125653]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421037.945402-427-65948198053841/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:40 np0005595445 python3.9[125806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:41.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095041 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:50:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:41 np0005595445 python3.9[125931]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421039.5579093-472-108190636931665/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 04:50:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 04:50:42 np0005595445 python3.9[126084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:43.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:43.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:43 np0005595445 python3.9[126209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421041.4479537-517-53879095357632/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:44 np0005595445 python3.9[126362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:44 np0005595445 python3.9[126487]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421043.6279104-562-50147066338492/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:45.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:45.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:45 np0005595445 python3.9[126640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:50:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:46 np0005595445 python3.9[126765]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421045.2046874-607-156071656345633/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:47.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:47.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:47 np0005595445 python3.9[126917]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:48 np0005595445 python3.9[127070]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:50:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:49.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:50:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:49.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:49 np0005595445 python3.9[127225]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:50 np0005595445 python3.9[127380]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:51.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:51.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:51 np0005595445 python3.9[127533]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:50:52 np0005595445 python3.9[127688]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:53 np0005595445 python3.9[127843]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:50:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:53.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095053 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:50:54 np0005595445 python3.9[127994]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:50:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:55.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:56 np0005595445 python3.9[128148]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:56 np0005595445 ovs-vsctl[128149]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 04:50:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:50:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:57.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:57 np0005595445 python3.9[128326]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:50:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:57.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:50:58 np0005595445 python3.9[128482]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:50:58 np0005595445 ovs-vsctl[128483]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 04:50:58 np0005595445 python3.9[128633]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:50:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:50:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:50:59.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:50:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:50:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:50:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:50:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 04:50:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:50:59.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 04:50:59 np0005595445 python3.9[128788]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:00 np0005595445 python3.9[128940]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:01.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:01 np0005595445 python3.9[129018]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:01.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:02 np0005595445 python3.9[129171]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:02 np0005595445 python3.9[129249]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:03.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:51:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:03.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:03 np0005595445 python3.9[129401]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:04 np0005595445 python3.9[129554]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:04 np0005595445 python3.9[129632]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:05.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:05.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 04:51:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2458 writes, 14K keys, 2458 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2458 writes, 2458 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2458 writes, 14K keys, 2458 commit groups, 1.0 writes per commit group, ingest: 37.70 MB, 0.06 MB/s#012Interval WAL: 2458 writes, 2458 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    122.2      0.17              0.06         6    0.028       0      0       0.0       0.0#012  L6      1/0   12.28 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0    145.5    128.5      0.49              0.14         5    0.098     21K   2276       0.0       0.0#012 Sum      1/0   12.28 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0    108.2    126.9      0.66              0.19        11    0.060     21K   2276       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0    108.6    127.3      0.65              0.19        10    0.065     21K   2276       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    145.5    128.5      0.49              0.14         5    0.098     21K   2276       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    124.1      0.17              0.06         5    0.033       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.020#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 2.15 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.00014 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(145,1.95 MB,0.639865%) FilterBlock(11,67.42 KB,0.0216584%) IndexBlock(11,139.77 KB,0.044898%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 04:51:06 np0005595445 python3.9[129785]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:06 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:51:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:06 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:51:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:06 np0005595445 python3.9[129863]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:07.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:07.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:07 np0005595445 python3.9[130015]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:51:07 np0005595445 systemd[1]: Reloading.
Jan 26 04:51:07 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:51:07 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:51:08 np0005595445 python3.9[130206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:09.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:51:09 np0005595445 python3.9[130284]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:09.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:10 np0005595445 python3.9[130437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:10 np0005595445 python3.9[130515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff4800a7e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:11.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:11 np0005595445 python3.9[130672]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:51:11 np0005595445 systemd[1]: Reloading.
Jan 26 04:51:11 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:51:11 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:51:12 np0005595445 systemd[1]: Starting Create netns directory...
Jan 26 04:51:12 np0005595445 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 04:51:12 np0005595445 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 04:51:12 np0005595445 systemd[1]: Finished Create netns directory.
Jan 26 04:51:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:13.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:13 np0005595445 python3.9[130868]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:13.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:14 np0005595445 python3.9[131021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:14 np0005595445 python3.9[131144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421073.5682874-1360-4511416554229/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:15.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:15.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095115 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:51:16 np0005595445 python3.9[131297]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:16 np0005595445 python3.9[131474]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:17.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:17.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:17 np0005595445 python3.9[131628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:18 np0005595445 python3.9[131752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421077.1851642-1459-30485736147614/.source.json _original_basename=.lb_5vyec follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:19.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:19 np0005595445 python3.9[131902]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:19.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:21.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff180025a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:21.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:21 np0005595445 python3.9[132327]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 04:51:22 np0005595445 python3.9[132479]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 04:51:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:23.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:23.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:24 np0005595445 python3[132633]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 04:51:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:25.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:25.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:27.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:27.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:29.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:29 np0005595445 podman[132646]: 2026-01-26 09:51:29.311208781 +0000 UTC m=+4.919707012 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 04:51:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:29.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:29 np0005595445 podman[132766]: 2026-01-26 09:51:29.543831579 +0000 UTC m=+0.071855592 container create 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:51:29 np0005595445 podman[132766]: 2026-01-26 09:51:29.514040286 +0000 UTC m=+0.042064289 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 04:51:29 np0005595445 python3[132633]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 26 04:51:30 np0005595445 python3.9[132957]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:51:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:31.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:31.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:31 np0005595445 python3.9[133194]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:51:32 np0005595445 python3.9[133270]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:51:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:33.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:33.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:51:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:51:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:51:33 np0005595445 python3.9[133421]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421092.5742908-1693-176886658685265/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:34 np0005595445 python3.9[133499]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:51:34 np0005595445 systemd[1]: Reloading.
Jan 26 04:51:34 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:51:34 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:51:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:35.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:35.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:35 np0005595445 python3.9[133612]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:51:35 np0005595445 systemd[1]: Reloading.
Jan 26 04:51:35 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:51:35 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:51:36 np0005595445 systemd[1]: Starting ovn_controller container...
Jan 26 04:51:36 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:51:36 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8bf517f7993a3b8facf5a0730bef250c97d1a838382ce53c490ea79845b577/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 04:51:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:36 np0005595445 systemd[1]: Started /usr/bin/podman healthcheck run 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496.
Jan 26 04:51:36 np0005595445 podman[133654]: 2026-01-26 09:51:36.255855719 +0000 UTC m=+0.173310740 container init 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + sudo -E kolla_set_configs
Jan 26 04:51:36 np0005595445 podman[133654]: 2026-01-26 09:51:36.293538747 +0000 UTC m=+0.210993748 container start 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 04:51:36 np0005595445 edpm-start-podman-container[133654]: ovn_controller
Jan 26 04:51:36 np0005595445 systemd[1]: Created slice User Slice of UID 0.
Jan 26 04:51:36 np0005595445 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 04:51:36 np0005595445 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 04:51:36 np0005595445 systemd[1]: Starting User Manager for UID 0...
Jan 26 04:51:36 np0005595445 edpm-start-podman-container[133653]: Creating additional drop-in dependency for "ovn_controller" (34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496)
Jan 26 04:51:36 np0005595445 podman[133677]: 2026-01-26 09:51:36.390585036 +0000 UTC m=+0.083410708 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 04:51:36 np0005595445 systemd[1]: 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496-88ead25080fd3b5.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 04:51:36 np0005595445 systemd[1]: 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496-88ead25080fd3b5.service: Failed with result 'exit-code'.
Jan 26 04:51:36 np0005595445 systemd[1]: Reloading.
Jan 26 04:51:36 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:51:36 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:51:36 np0005595445 systemd[133709]: Queued start job for default target Main User Target.
Jan 26 04:51:36 np0005595445 systemd[133709]: Created slice User Application Slice.
Jan 26 04:51:36 np0005595445 systemd[133709]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 04:51:36 np0005595445 systemd[133709]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 04:51:36 np0005595445 systemd[133709]: Reached target Paths.
Jan 26 04:51:36 np0005595445 systemd[133709]: Reached target Timers.
Jan 26 04:51:36 np0005595445 systemd[133709]: Starting D-Bus User Message Bus Socket...
Jan 26 04:51:36 np0005595445 systemd[133709]: Starting Create User's Volatile Files and Directories...
Jan 26 04:51:36 np0005595445 systemd[133709]: Finished Create User's Volatile Files and Directories.
Jan 26 04:51:36 np0005595445 systemd[133709]: Listening on D-Bus User Message Bus Socket.
Jan 26 04:51:36 np0005595445 systemd[133709]: Reached target Sockets.
Jan 26 04:51:36 np0005595445 systemd[133709]: Reached target Basic System.
Jan 26 04:51:36 np0005595445 systemd[133709]: Reached target Main User Target.
Jan 26 04:51:36 np0005595445 systemd[133709]: Startup finished in 170ms.
Jan 26 04:51:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095136 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:51:36 np0005595445 systemd[1]: Started User Manager for UID 0.
Jan 26 04:51:36 np0005595445 systemd[1]: Started ovn_controller container.
Jan 26 04:51:36 np0005595445 systemd[1]: Started Session c1 of User root.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: INFO:__main__:Validating config file
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: INFO:__main__:Writing out command to execute
Jan 26 04:51:36 np0005595445 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: ++ cat /run_command
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + ARGS=
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + sudo kolla_copy_cacerts
Jan 26 04:51:36 np0005595445 systemd[1]: Started Session c2 of User root.
Jan 26 04:51:36 np0005595445 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + [[ ! -n '' ]]
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + . kolla_extend_start
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + umask 0022
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9151] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9163] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <warn>  [1769421096.9169] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9183] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9193] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9203] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 04:51:36 np0005595445 kernel: br-int: entered promiscuous mode
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 04:51:36 np0005595445 ovn_controller[133670]: 2026-01-26T09:51:36Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9478] manager: (ovn-80993c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9488] manager: (ovn-f90cdf-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 04:51:36 np0005595445 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9718] device (genev_sys_6081): carrier: link connected
Jan 26 04:51:36 np0005595445 NetworkManager[49073]: <info>  [1769421096.9723] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 04:51:36 np0005595445 systemd-udevd[133828]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:51:36 np0005595445 systemd-udevd[133829]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 04:51:37 np0005595445 NetworkManager[49073]: <info>  [1769421097.0385] manager: (ovn-8128a1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 04:51:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:37.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:37 np0005595445 python3.9[133960]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 04:51:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:51:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:39.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:51:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000081s ======
Jan 26 04:51:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:39.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000081s
Jan 26 04:51:39 np0005595445 python3.9[134129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:40 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:51:40 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:51:40 np0005595445 python3.9[134278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421098.4445858-1828-142697491935149/.source.yaml _original_basename=.918c_9n7 follow=False checksum=da37636c5844ad86706b9cbdcceae3b87fc97017 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:51:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:41 np0005595445 python3.9[134430]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:51:41 np0005595445 ovs-vsctl[134431]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 04:51:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:41.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095141 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:51:42 np0005595445 python3.9[134584]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:51:42 np0005595445 ovs-vsctl[134586]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 04:51:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:43.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:44 np0005595445 python3.9[134740]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:51:44 np0005595445 ovs-vsctl[134741]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 04:51:44 np0005595445 systemd[1]: session-49.scope: Deactivated successfully.
Jan 26 04:51:44 np0005595445 systemd[1]: session-49.scope: Consumed 1min 8.360s CPU time.
Jan 26 04:51:44 np0005595445 systemd-logind[783]: Session 49 logged out. Waiting for processes to exit.
Jan 26 04:51:44 np0005595445 systemd-logind[783]: Removed session 49.
Jan 26 04:51:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:45.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:51:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:46 np0005595445 systemd[1]: Stopping User Manager for UID 0...
Jan 26 04:51:46 np0005595445 systemd[133709]: Activating special unit Exit the Session...
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped target Main User Target.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped target Basic System.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped target Paths.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped target Sockets.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped target Timers.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 04:51:46 np0005595445 systemd[133709]: Closed D-Bus User Message Bus Socket.
Jan 26 04:51:46 np0005595445 systemd[133709]: Stopped Create User's Volatile Files and Directories.
Jan 26 04:51:46 np0005595445 systemd[133709]: Removed slice User Application Slice.
Jan 26 04:51:46 np0005595445 systemd[133709]: Reached target Shutdown.
Jan 26 04:51:46 np0005595445 systemd[133709]: Finished Exit the Session.
Jan 26 04:51:46 np0005595445 systemd[133709]: Reached target Exit the Session.
Jan 26 04:51:46 np0005595445 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 04:51:46 np0005595445 systemd[1]: Stopped User Manager for UID 0.
Jan 26 04:51:46 np0005595445 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 04:51:47 np0005595445 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 04:51:47 np0005595445 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 04:51:47 np0005595445 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 04:51:47 np0005595445 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 04:51:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:47.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:48 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:51:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:48 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:51:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:51:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:51:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:49.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:49 np0005595445 systemd-logind[783]: New session 51 of user zuul.
Jan 26 04:51:49 np0005595445 systemd[1]: Started Session 51 of User zuul.
Jan 26 04:51:50 np0005595445 python3.9[134924]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:51:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:51.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:51:52 np0005595445 python3.9[135081]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:52 np0005595445 python3.9[135233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:53.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:53 np0005595445 python3.9[135386]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:54 np0005595445 python3.9[135538]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:51:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:55 np0005595445 python3.9[135691]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:51:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:51:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:51:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:51:56 np0005595445 python3.9[135844]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:51:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:57.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:57 np0005595445 python3.9[136021]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 04:51:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:51:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:57.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:51:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:58 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:51:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:58 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:51:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095158 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:51:58 np0005595445 python3.9[136172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:51:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:51:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:51:59.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:51:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:51:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:51:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:51:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:51:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:51:59.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:51:59 np0005595445 python3.9[136294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421118.1652422-214-224095567021030/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:00 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:52:00 np0005595445 python3.9[136446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:01 np0005595445 python3.9[136567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421119.9583406-259-86040440830638/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:01.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:01.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:02 np0005595445 python3.9[136720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:52:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:03 np0005595445 python3.9[136804]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:52:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:03.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:03.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095203 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:52:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:05.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:05.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:05 np0005595445 python3.9[136960]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:52:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:06 np0005595445 python3.9[137114]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:06 np0005595445 ovn_controller[133670]: 2026-01-26T09:52:06Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Jan 26 04:52:06 np0005595445 ovn_controller[133670]: 2026-01-26T09:52:06Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 26 04:52:06 np0005595445 podman[137209]: 2026-01-26 09:52:06.963924556 +0000 UTC m=+0.101715683 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:52:07 np0005595445 python3.9[137247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421125.9850605-370-128080199012704/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:07.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:07.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:07 np0005595445 python3.9[137412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:08 np0005595445 python3.9[137533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421127.3099568-370-245166052308011/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:09.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:09.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:09 np0005595445 python3.9[137684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:10 np0005595445 python3.9[137805]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421129.300026-502-19400771539884/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:11 np0005595445 python3.9[137955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:11.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:11 np0005595445 python3.9[138077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421130.6141384-502-243027666593439/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:12 np0005595445 python3.9[138227]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:52:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:13 np0005595445 python3.9[138382]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:14 np0005595445 python3.9[138534]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:15 np0005595445 python3.9[138612]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:15.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:15.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:15 np0005595445 python3.9[138765]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:16 np0005595445 python3.9[138843]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:17.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:17 np0005595445 python3.9[139019]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:17.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:18 np0005595445 python3.9[139175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:18 np0005595445 python3.9[139253]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:19.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:19 np0005595445 python3.9[139405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:19.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:19 np0005595445 python3.9[139484]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:20 np0005595445 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:52:20 np0005595445 systemd[1]: Reloading.
Jan 26 04:52:20 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:52:20 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:52:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004060 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:21.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:21 np0005595445 python3.9[139826]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:22 np0005595445 python3.9[139904]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:23 np0005595445 python3.9[140056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:23.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:23 np0005595445 python3.9[140135]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:24 np0005595445 python3.9[140287]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:52:24 np0005595445 systemd[1]: Reloading.
Jan 26 04:52:24 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:52:24 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:52:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:25.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:25 np0005595445 systemd[1]: Starting Create netns directory...
Jan 26 04:52:25 np0005595445 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 04:52:25 np0005595445 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 04:52:25 np0005595445 systemd[1]: Finished Create netns directory.
Jan 26 04:52:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:26 np0005595445 python3.9[140482]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:27.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:27.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:27 np0005595445 python3.9[140635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:28 np0005595445 python3.9[140759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421147.0174866-955-33889155433279/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:29 np0005595445 python3.9[140911]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:29.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:30 np0005595445 python3.9[141064]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:52:30 np0005595445 python3.9[141216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:31.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:31.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:31 np0005595445 python3.9[141339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421150.4518435-1054-13784981376938/.source.json _original_basename=.mhwyc9ru follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:32 np0005595445 python3.9[141490]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:33.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:33.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:35 np0005595445 python3.9[141914]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 04:52:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400040e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:35.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:35.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:36 np0005595445 python3.9[142067]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 04:52:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:37 np0005595445 podman[142209]: 2026-01-26 09:52:37.206785298 +0000 UTC m=+0.096640342 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 26 04:52:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:37.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004100 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:37 np0005595445 python3[142259]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 04:52:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:37.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:39.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:39.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095240 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 2ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:52:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004120 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:41.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:52:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:41.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:52:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:43.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:43.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:45.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:45.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:52:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:52:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:52:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:52:45 np0005595445 podman[142284]: 2026-01-26 09:52:45.885891018 +0000 UTC m=+8.346917562 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 04:52:46 np0005595445 podman[142499]: 2026-01-26 09:52:46.050877847 +0000 UTC m=+0.061087709 container create 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:52:46 np0005595445 podman[142499]: 2026-01-26 09:52:46.015444398 +0000 UTC m=+0.025654280 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 04:52:46 np0005595445 python3[142259]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 04:52:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:47 np0005595445 python3.9[142689]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:52:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000055s ======
Jan 26 04:52:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:47.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 26 04:52:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:47.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:47 np0005595445 python3.9[142844]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:48 np0005595445 python3.9[142920]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:52:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:52:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:49.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:49 np0005595445 python3.9[143071]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421168.5923753-1288-184615256381360/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:49.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:49 np0005595445 python3.9[143149]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:52:49 np0005595445 systemd[1]: Reloading.
Jan 26 04:52:50 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:52:50 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:52:50 np0005595445 python3.9[143261]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:52:51 np0005595445 systemd[1]: Reloading.
Jan 26 04:52:51 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:52:51 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:52:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:51.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:51.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:51 np0005595445 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 04:52:51 np0005595445 systemd[1]: Started libcrun container.
Jan 26 04:52:51 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb44cd548efab5b6a01ac0e4aed17369054a220407ab2fc8e605797c1a0444eb/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 04:52:51 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb44cd548efab5b6a01ac0e4aed17369054a220407ab2fc8e605797c1a0444eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 04:52:51 np0005595445 systemd[1]: Started /usr/bin/podman healthcheck run 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98.
Jan 26 04:52:51 np0005595445 podman[143305]: 2026-01-26 09:52:51.91638672 +0000 UTC m=+0.307572271 container init 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: + sudo -E kolla_set_configs
Jan 26 04:52:51 np0005595445 podman[143305]: 2026-01-26 09:52:51.949563047 +0000 UTC m=+0.340748588 container start 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:52:51 np0005595445 edpm-start-podman-container[143305]: ovn_metadata_agent
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Validating config file
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Copying service configuration files
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Writing out command to execute
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 04:52:51 np0005595445 ovn_metadata_agent[143321]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: ++ cat /run_command
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + CMD=neutron-ovn-metadata-agent
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + ARGS=
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + sudo kolla_copy_cacerts
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + [[ ! -n '' ]]
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + . kolla_extend_start
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + umask 0022
Jan 26 04:52:52 np0005595445 ovn_metadata_agent[143321]: + exec neutron-ovn-metadata-agent
Jan 26 04:52:52 np0005595445 podman[143328]: 2026-01-26 09:52:52.055885466 +0000 UTC m=+0.086600175 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:52:52 np0005595445 edpm-start-podman-container[143304]: Creating additional drop-in dependency for "ovn_metadata_agent" (6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98)
Jan 26 04:52:52 np0005595445 systemd[1]: Reloading.
Jan 26 04:52:52 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:52:52 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:52:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:52 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:52:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:52 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:52:52 np0005595445 systemd[1]: Started ovn_metadata_agent container.
Jan 26 04:52:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:52:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:52:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:53.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:53.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.873 143326 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.873 143326 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.874 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.875 143326 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.876 143326 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.877 143326 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.878 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.879 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.880 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.881 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.882 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.883 143326 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.884 143326 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.885 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.886 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.887 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.888 143326 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.889 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.890 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.891 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.892 143326 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.893 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.894 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.895 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.896 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.897 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.898 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.899 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.900 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.901 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.902 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.903 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.904 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.905 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.906 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.907 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.908 143326 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.917 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.918 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.918 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.934 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 5f259fb6-5896-4c89-8853-1dd537a2ebf7 (UUID: 5f259fb6-5896-4c89-8853-1dd537a2ebf7) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.958 143326 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.959 143326 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.962 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.972 143326 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.979 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '5f259fb6-5896-4c89-8853-1dd537a2ebf7'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], external_ids={}, name=5f259fb6-5896-4c89-8853-1dd537a2ebf7, nb_cfg_timestamp=1769421104940, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.980 143326 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f24b0ca0f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.981 143326 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.982 143326 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.988 143326 DEBUG oslo_service.service [-] Started child 143586 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.992 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpsb9r0wpn/privsep.sock']#033[00m
Jan 26 04:52:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:53.994 143586 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-245215'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.032 143586 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.033 143586 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.033 143586 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.038 143586 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.046 143586 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.052 143586 INFO eventlet.wsgi.server [-] (143586) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 26 04:52:54 np0005595445 python3.9[143585]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 04:52:54 np0005595445 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.714 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.715 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpsb9r0wpn/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.568 143615 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.572 143615 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.574 143615 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.575 143615 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143615#033[00m
Jan 26 04:52:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:54.720 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[38a28a7a-1f01-4f38-98f2-84aa8d5e0750]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 04:52:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.242 143615 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:52:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:52:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:55.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:55 np0005595445 python3.9[143747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:52:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:55.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.834 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[d008b7ba-10ee-4d98-81f6-74cbf2349b4d]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.839 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, column=external_ids, values=({'neutron:ovn-metadata-id': 'b7f82c07-4108-5b1a-8b28-274a8ea4043b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.851 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.860 143326 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.860 143326 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.861 143326 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.862 143326 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.863 143326 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.864 143326 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.865 143326 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.866 143326 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.867 143326 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.868 143326 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.868 143326 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.869 143326 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.869 143326 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.870 143326 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.870 143326 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.871 143326 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.872 143326 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.873 143326 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.873 143326 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.874 143326 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.875 143326 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 python3.9[143873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421174.7532618-1423-171386248859386/.source.yaml _original_basename=.ag3tdlg7 follow=False checksum=e4cba382ee426a679e5ef46b4fc246a694e7130c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.876 143326 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.877 143326 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.877 143326 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.878 143326 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.879 143326 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.880 143326 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.881 143326 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.882 143326 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.883 143326 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.884 143326 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.885 143326 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.886 143326 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.886 143326 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.887 143326 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.887 143326 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.888 143326 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.889 143326 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.890 143326 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.891 143326 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.892 143326 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.893 143326 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.894 143326 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.895 143326 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.896 143326 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.897 143326 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.897 143326 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.898 143326 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.898 143326 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.899 143326 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.900 143326 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.900 143326 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.901 143326 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.902 143326 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.903 143326 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.903 143326 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.904 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.905 143326 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.906 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.907 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.908 143326 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.909 143326 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.910 143326 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.911 143326 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.912 143326 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.913 143326 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.914 143326 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.915 143326 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.916 143326 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.917 143326 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.918 143326 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.919 143326 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.920 143326 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.921 143326 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.922 143326 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.923 143326 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.924 143326 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.925 143326 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.926 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.927 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.928 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.929 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 04:52:55 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:52:55.930 143326 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 04:52:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:52:56 np0005595445 systemd[1]: session-51.scope: Deactivated successfully.
Jan 26 04:52:56 np0005595445 systemd[1]: session-51.scope: Consumed 1min 2.975s CPU time.
Jan 26 04:52:56 np0005595445 systemd-logind[783]: Session 51 logged out. Waiting for processes to exit.
Jan 26 04:52:56 np0005595445 systemd-logind[783]: Removed session 51.
Jan 26 04:52:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:57.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:57.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:52:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:52:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:52:59.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:52:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:52:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:52:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:52:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:52:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:52:59.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095300 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:53:01 np0005595445 systemd-logind[783]: New session 52 of user zuul.
Jan 26 04:53:01 np0005595445 systemd[1]: Started Session 52 of User zuul.
Jan 26 04:53:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:01.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:01.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:02 np0005595445 python3.9[144081]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:53:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:03.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:04 np0005595445 python3.9[144238]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:05.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:53:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:05.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:53:05 np0005595445 python3.9[144403]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:53:05 np0005595445 systemd[1]: Reloading.
Jan 26 04:53:05 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:53:05 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:53:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:07 np0005595445 python3.9[144589]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:53:07 np0005595445 network[144606]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:53:07 np0005595445 network[144607]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:53:07 np0005595445 network[144608]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:53:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:07.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:07.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:08 np0005595445 podman[144616]: 2026-01-26 09:53:08.08180841 +0000 UTC m=+0.138118368 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 04:53:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:09.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:09.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:11.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:11.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:13.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:53:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:13.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:53:14 np0005595445 python3.9[144902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003e80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:15.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:15.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:15 np0005595445 python3.9[145056]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:16 np0005595445 python3.9[145209]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:17.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:17.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:17 np0005595445 python3.9[145367]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:18 np0005595445 python3.9[145541]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:19.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:19.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:19 np0005595445 python3.9[145695]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:20 np0005595445 python3.9[145848]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:53:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:53:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:21.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:53:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:21.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:22 np0005595445 podman[145875]: 2026-01-26 09:53:22.319861183 +0000 UTC m=+0.083532400 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:53:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:53:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:23.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:53:24 np0005595445 python3.9[146022]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:24 np0005595445 python3.9[146174]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:25.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:25.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:25 np0005595445 python3.9[146327]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:26 np0005595445 python3.9[146479]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:27 np0005595445 python3.9[146631]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:27.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:27.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:27 np0005595445 python3.9[146784]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:28 np0005595445 python3.9[146936]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:29.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:29 np0005595445 python3.9[147088]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:29.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:30 np0005595445 python3.9[147241]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:31 np0005595445 python3.9[147393]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:31.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:31.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:31 np0005595445 python3.9[147546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:32 np0005595445 python3.9[147698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000055s ======
Jan 26 04:53:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:33.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Jan 26 04:53:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:33.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:33 np0005595445 python3.9[147850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:34 np0005595445 python3.9[148003]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:53:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 04:53:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:35.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 04:53:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:35.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:35 np0005595445 python3.9[148157]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:36 np0005595445 python3.9[148310]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 04:53:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:37.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:37.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:37 np0005595445 python3.9[148488]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:53:37 np0005595445 systemd[1]: Reloading.
Jan 26 04:53:38 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:53:38 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:53:38 np0005595445 podman[148523]: 2026-01-26 09:53:38.405436686 +0000 UTC m=+0.126490997 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:53:39 np0005595445 python3.9[148701]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:39.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:39.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:39 np0005595445 python3.9[148855]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:41.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:41.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:41 np0005595445 python3.9[149009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:42 np0005595445 python3.9[149163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20003f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:43 np0005595445 python3.9[149317]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:43.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:53:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:43.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:53:44 np0005595445 python3.9[149473]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:44 np0005595445 python3.9[149626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:53:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:45.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:45.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:46 np0005595445 python3.9[149782]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 04:53:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:47.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:47 np0005595445 python3.9[149935]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 04:53:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:47.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:48 np0005595445 python3.9[150094]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 04:53:48 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:53:48 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 04:53:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:49.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:49 np0005595445 python3.9[150256]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:53:50 np0005595445 python3.9[150340]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:53:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:51.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:52 np0005595445 podman[150373]: 2026-01-26 09:53:52.70576095 +0000 UTC m=+0.065399158 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 04:53:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:53.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:53.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.911 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:53:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.912 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:53:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:53:53.912 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:53:54 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:53:54 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:53:54 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:53:54 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:53:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:55.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:53:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:57.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:53:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:53:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:53:59.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:53:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:53:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004020 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:53:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:53:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:53:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:53:59.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:54:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:54:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:01.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40002480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:54:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:01.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:54:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:03.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:03.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:54:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:05.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:54:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:05.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:07.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:07.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:09 np0005595445 podman[150687]: 2026-01-26 09:54:09.35398483 +0000 UTC m=+0.129598515 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 26 04:54:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:09.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:09.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:11.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:11.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095412 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:54:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095412 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:54:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:13.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:13.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:15.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:15.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff10001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:17.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:17.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:18 np0005595445 kernel: SELinux:  Converting 2778 SID table entries...
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:54:18 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:54:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18004040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:19.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:19.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:20 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:54:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:21.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:21.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:23 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 04:54:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:23 np0005595445 podman[150758]: 2026-01-26 09:54:23.325296834 +0000 UTC m=+0.081824685 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 04:54:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:23.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:54:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:54:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:54:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:54:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:54:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:25.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:54:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:54:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:28 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:54:28 np0005595445 kernel: SELinux:  Converting 2778 SID table entries...
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:54:28 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:54:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff1800bd80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:29.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:31.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095432 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:54:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:33.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:33.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095434 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:54:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:35.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:37.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:37 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 04:54:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:39.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000053s ======
Jan 26 04:54:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:39.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Jan 26 04:54:40 np0005595445 podman[150823]: 2026-01-26 09:54:40.336841268 +0000 UTC m=+0.105312059 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 04:54:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:41.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:41.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:54:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:43.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:54:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:43.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:45.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:47.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:47.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:49.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:49.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:51.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:51.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff400049e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:54:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:53.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:54:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.913 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:54:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:54:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:54:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:54:54 np0005595445 podman[158004]: 2026-01-26 09:54:54.314363271 +0000 UTC m=+0.075960093 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 04:54:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:55.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:54:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:57.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:54:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:54:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff20001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:54:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:54:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:54:59.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:54:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:54:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:54:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:54:59.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 04:55:00 np0005595445 podman[161346]: 2026-01-26 09:55:00.292143627 +0000 UTC m=+0.141941575 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 26 04:55:00 np0005595445 podman[161346]: 2026-01-26 09:55:00.401257004 +0000 UTC m=+0.251054942 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 04:55:01 np0005595445 podman[161916]: 2026-01-26 09:55:01.102163682 +0000 UTC m=+0.083888400 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:55:01 np0005595445 podman[161916]: 2026-01-26 09:55:01.11818426 +0000 UTC m=+0.099908938 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 04:55:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24004590 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:01 np0005595445 podman[162153]: 2026-01-26 09:55:01.418498716 +0000 UTC m=+0.061134966 container exec a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:55:01 np0005595445 podman[162153]: 2026-01-26 09:55:01.436236879 +0000 UTC m=+0.078873099 container exec_died a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 26 04:55:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:01.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:01 np0005595445 podman[162363]: 2026-01-26 09:55:01.670998664 +0000 UTC m=+0.054774009 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:55:01 np0005595445 podman[162363]: 2026-01-26 09:55:01.684020254 +0000 UTC m=+0.067795589 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 04:55:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:01.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:01 np0005595445 podman[162575]: 2026-01-26 09:55:01.914797766 +0000 UTC m=+0.056225538 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.openshift.expose-services=, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=2.2.4, architecture=x86_64, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 04:55:01 np0005595445 podman[162575]: 2026-01-26 09:55:01.928281798 +0000 UTC m=+0.069709560 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, architecture=x86_64, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., vcs-type=git, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, version=2.2.4)
Jan 26 04:55:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:55:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:03.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c002910 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:05.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:55:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:05.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:55:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:07.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:07.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004a80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:09.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:09.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:11 np0005595445 podman[167922]: 2026-01-26 09:55:11.214662462 +0000 UTC m=+0.089169277 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 26 04:55:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:11.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:11.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:12 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:12 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:55:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:13.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:13.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100036f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:15.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:55:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:15.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:55:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ae0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:17.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:17.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:19.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095521 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:55:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:21.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:21.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:23.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:25 np0005595445 podman[168462]: 2026-01-26 09:55:25.321909157 +0000 UTC m=+0.086081938 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 04:55:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:25.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:25.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:27.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:55:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:27.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:55:28 np0005595445 kernel: SELinux:  Converting 2779 SID table entries...
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability open_perms=1
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability always_check_network=0
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 04:55:28 np0005595445 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 04:55:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:29.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:29.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:55:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:31.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:32 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:55:32 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 04:55:32 np0005595445 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 26 04:55:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:33 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:33.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:33.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:55:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:55:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:35 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:35.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:35.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:37 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004be0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:37.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:38 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:55:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff480089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:39 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:39.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:39.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:40 np0005595445 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 04:55:40 np0005595445 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 04:55:40 np0005595445 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 04:55:40 np0005595445 systemd[1]: sshd.service: Consumed 7.777s CPU time, read 32.0K from disk, written 112.0K to disk.
Jan 26 04:55:40 np0005595445 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 04:55:40 np0005595445 systemd[1]: Stopping sshd-keygen.target...
Jan 26 04:55:40 np0005595445 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:55:40 np0005595445 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:55:40 np0005595445 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 04:55:40 np0005595445 systemd[1]: Reached target sshd-keygen.target.
Jan 26 04:55:40 np0005595445 systemd[1]: Starting OpenSSH server daemon...
Jan 26 04:55:40 np0005595445 systemd[1]: Started OpenSSH server daemon.
Jan 26 04:55:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:41 np0005595445 podman[169451]: 2026-01-26 09:55:41.416013642 +0000 UTC m=+0.140604830 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 04:55:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:41 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:41.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095543 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:55:43 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:55:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:43 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:55:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:43 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:43 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:43 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:43 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:43 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:55:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:43.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008b70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:45 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:45.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:45.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:47 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:47.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:47.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:49 np0005595445 python3.9[174938]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:55:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:49 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:49 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:49 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:49 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:49.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:49.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:50 np0005595445 python3.9[176180]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:55:50 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:50 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:50 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:51 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:51.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:51 np0005595445 python3.9[177542]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:55:51 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:51 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:51 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:51.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:52 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:55:52 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:55:52 np0005595445 systemd[1]: man-db-cache-update.service: Consumed 11.852s CPU time.
Jan 26 04:55:52 np0005595445 systemd[1]: run-r30ecc3fc361a49d4ae0e2a05cdb3fce2.service: Deactivated successfully.
Jan 26 04:55:52 np0005595445 python3.9[178773]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:55:52 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:52 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:52 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:53 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:55:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:53.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:55:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:53.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.915 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:55:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.916 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:55:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:55:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:55:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003620 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004c20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:55 np0005595445 python3.9[178994]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:55:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:55 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:55.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:55 np0005595445 podman[178997]: 2026-01-26 09:55:55.616065417 +0000 UTC m=+0.079957208 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 04:55:55 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:55 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:55 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:55.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:56 np0005595445 python3.9[179204]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:55:56 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:57 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:57 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003640 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:57 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:57.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:55:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:57.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:58 np0005595445 python3.9[179395]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:55:58 np0005595445 systemd[1]: Reloading.
Jan 26 04:55:58 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:55:58 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:55:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:55:59 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:55:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:55:59.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:55:59 np0005595445 python3.9[179611]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:55:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:55:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:55:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:55:59.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:00 np0005595445 python3.9[179767]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:00 np0005595445 systemd[1]: Reloading.
Jan 26 04:56:00 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:56:00 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:56:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:01 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:01.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:01.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:02 np0005595445 python3.9[179959]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 04:56:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:02 np0005595445 systemd[1]: Reloading.
Jan 26 04:56:02 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:56:02 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:56:02 np0005595445 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 04:56:02 np0005595445 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 04:56:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:03 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:03.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:03.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:03 np0005595445 python3.9[180153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:04 np0005595445 python3.9[180308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:05 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:05.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:06 np0005595445 python3.9[180464]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:07 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:07.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:07 np0005595445 python3.9[180619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:07.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:08 np0005595445 python3.9[180775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:09 np0005595445 python3.9[180930]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:09 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:09.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:09.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:10 np0005595445 python3.9[181086]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095611 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:56:11 np0005595445 python3.9[181241]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:11 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 04:56:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:11.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 04:56:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:11.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:11 np0005595445 podman[181436]: 2026-01-26 09:56:11.930674435 +0000 UTC m=+0.094143108 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 04:56:12 np0005595445 python3.9[181501]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:13 np0005595445 python3.9[181658]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:13 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:13.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:56:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:56:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:56:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:56:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:13.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:15 np0005595445 python3.9[181814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:15 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:15.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:15.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:17 np0005595445 python3.9[181970]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:17 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:17.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:17.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:18 np0005595445 python3.9[182127]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:18 np0005595445 python3.9[182333]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 04:56:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:56:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:56:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff100047f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:19 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:56:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:19.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:19.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:20 np0005595445 python3.9[182489]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:20 np0005595445 python3.9[182641]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff48008d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:21 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c0036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:21 np0005595445 python3.9[182795]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:21.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:21.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:22 np0005595445 python3.9[182948]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:56:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:56:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:22 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:56:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:22 np0005595445 python3.9[183100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.080030) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383080085, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4703, "num_deletes": 502, "total_data_size": 12917563, "memory_usage": 13092048, "flush_reason": "Manual Compaction"}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383146958, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8366266, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13208, "largest_seqno": 17906, "table_properties": {"data_size": 8348437, "index_size": 12083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36561, "raw_average_key_size": 19, "raw_value_size": 8311857, "raw_average_value_size": 4475, "num_data_blocks": 528, "num_entries": 1857, "num_filter_entries": 1857, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420938, "oldest_key_time": 1769420938, "file_creation_time": 1769421383, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 66955 microseconds, and 27408 cpu microseconds.
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.146997) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8366266 bytes OK
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.147014) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149341) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149358) EVENT_LOG_v1 {"time_micros": 1769421383149354, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.149396) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12897232, prev total WAL file size 12897232, number of live WAL files 2.
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.153141) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8170KB)], [27(12MB)]
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383153237, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21245174, "oldest_snapshot_seqno": -1}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5067 keys, 15580489 bytes, temperature: kUnknown
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383277106, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15580489, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15541645, "index_size": 25102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12677, "raw_key_size": 126680, "raw_average_key_size": 25, "raw_value_size": 15444863, "raw_average_value_size": 3048, "num_data_blocks": 1057, "num_entries": 5067, "num_filter_entries": 5067, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421383, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.277520) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15580489 bytes
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.279084) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.4 rd, 125.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(8.0, 12.3 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(4.4) write-amplify(1.9) OK, records in: 6089, records dropped: 1022 output_compression: NoCompression
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.279113) EVENT_LOG_v1 {"time_micros": 1769421383279100, "job": 14, "event": "compaction_finished", "compaction_time_micros": 123986, "compaction_time_cpu_micros": 33416, "output_level": 6, "num_output_files": 1, "total_output_size": 15580489, "num_input_records": 6089, "num_output_records": 5067, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383281931, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421383286615, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.152969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:23.286686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff40004f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:23 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff18002250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:23 np0005595445 python3.9[183253]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 04:56:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:23.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:24 np0005595445 python3.9[183403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:56:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:25 np0005595445 python3.9[183555]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:25 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:56:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:25.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:25.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:26 np0005595445 podman[183655]: 2026-01-26 09:56:26.047489478 +0000 UTC m=+0.092049243 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 26 04:56:26 np0005595445 python3.9[183702]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421384.816253-1642-192164753084171/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:26 np0005595445 python3.9[183854]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:27 np0005595445 python3.9[183979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421386.3845415-1642-137592759078000/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:27 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:27.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:27.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.869740) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387869799, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 289, "num_deletes": 250, "total_data_size": 124510, "memory_usage": 130432, "flush_reason": "Manual Compaction"}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387872745, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 81538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17911, "largest_seqno": 18195, "table_properties": {"data_size": 79610, "index_size": 156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5188, "raw_average_key_size": 19, "raw_value_size": 75849, "raw_average_value_size": 281, "num_data_blocks": 7, "num_entries": 269, "num_filter_entries": 269, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421384, "oldest_key_time": 1769421384, "file_creation_time": 1769421387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 3033 microseconds, and 1041 cpu microseconds.
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.872782) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 81538 bytes OK
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.872802) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874299) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874316) EVENT_LOG_v1 {"time_micros": 1769421387874311, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874335) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 122369, prev total WAL file size 122369, number of live WAL files 2.
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874760) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(79KB)], [30(14MB)]
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387874819, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 15662027, "oldest_snapshot_seqno": -1}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4828 keys, 11602143 bytes, temperature: kUnknown
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387968773, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11602143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11569428, "index_size": 19549, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12101, "raw_key_size": 122148, "raw_average_key_size": 25, "raw_value_size": 11481336, "raw_average_value_size": 2378, "num_data_blocks": 814, "num_entries": 4828, "num_filter_entries": 4828, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.969011) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11602143 bytes
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.970539) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.6 rd, 123.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 14.9 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(334.4) write-amplify(142.3) OK, records in: 5336, records dropped: 508 output_compression: NoCompression
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.970558) EVENT_LOG_v1 {"time_micros": 1769421387970549, "job": 16, "event": "compaction_finished", "compaction_time_micros": 94018, "compaction_time_cpu_micros": 28674, "output_level": 6, "num_output_files": 1, "total_output_size": 11602143, "num_input_records": 5336, "num_output_records": 4828, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387970688, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421387974338, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.874686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974398) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974399) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:27 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:56:27.974401) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:56:28 np0005595445 python3.9[184132]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:28 np0005595445 python3.9[184257]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421387.7144494-1642-191245863988166/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff24001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:29 np0005595445 python3.9[184409]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:29 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:29.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:29.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:30 np0005595445 python3.9[184535]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421388.9698806-1642-32913076201140/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:30 np0005595445 python3.9[184687]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095631 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:56:31 np0005595445 kernel: ganesha.nfsd[163468]: segfault at 50 ip 00007effcac6e32e sp 00007eff38ff8210 error 4 in libntirpc.so.5.8[7effcac53000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 26 04:56:31 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:56:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[123719]: 26/01/2026 09:56:31 : epoch 697738e1 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7eff3c003700 fd 38 proxy ignored for local
Jan 26 04:56:31 np0005595445 systemd[1]: Started Process Core Dump (PID 184813/UID 0).
Jan 26 04:56:31 np0005595445 python3.9[184812]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421390.260303-1642-201401313200375/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:31.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:31.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:32 np0005595445 python3.9[184967]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:32 np0005595445 systemd-coredump[184814]: Process 123723 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 74:#012#0  0x00007effcac6e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:56:32 np0005595445 systemd[1]: systemd-coredump@4-184813-0.service: Deactivated successfully.
Jan 26 04:56:32 np0005595445 systemd[1]: systemd-coredump@4-184813-0.service: Consumed 1.239s CPU time.
Jan 26 04:56:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:32 np0005595445 podman[185097]: 2026-01-26 09:56:32.704729441 +0000 UTC m=+0.045421596 container died a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 04:56:32 np0005595445 systemd[1]: var-lib-containers-storage-overlay-50ad3cd59e5c5d65126cb5de338a6d9e374cbf09eb3d2dd944710879fec095a1-merged.mount: Deactivated successfully.
Jan 26 04:56:32 np0005595445 podman[185097]: 2026-01-26 09:56:32.747321152 +0000 UTC m=+0.088013307 container remove a0e54dfdc80639d6a858a302b9137bdb7eb4ae31fc0c5c95cbf6faee0fa517e4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 04:56:32 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:56:32 np0005595445 python3.9[185093]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421391.6138182-1642-202640730484380/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:32 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:56:32 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 2.384s CPU time.
Jan 26 04:56:33 np0005595445 python3.9[185292]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:33.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:33.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:34 np0005595445 python3.9[185416]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421393.0110707-1642-162888978341113/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:34 np0005595445 python3.9[185570]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:35 np0005595445 python3.9[185695]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769421394.3208778-1642-198377813614528/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:35.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095636 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:56:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095637 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:56:37 np0005595445 python3.9[185848]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 04:56:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:37.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:37.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:38 np0005595445 python3.9[186004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:39 np0005595445 python3.9[186181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:39.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:39.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:40 np0005595445 python3.9[186334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:40 np0005595445 python3.9[186486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:41 np0005595445 python3.9[186638]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:56:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:41.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:56:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:41.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:42 np0005595445 python3.9[186791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:42 np0005595445 podman[186801]: 2026-01-26 09:56:42.303435152 +0000 UTC m=+0.078783279 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 26 04:56:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:42 np0005595445 python3.9[186967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:42 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 5.
Jan 26 04:56:42 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:56:42 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 2.384s CPU time.
Jan 26 04:56:42 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:56:43 np0005595445 podman[187084]: 2026-01-26 09:56:43.171683066 +0000 UTC m=+0.059991879 container create f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:56:43 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:56:43 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:56:43 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:56:43 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:56:43 np0005595445 podman[187084]: 2026-01-26 09:56:43.234054679 +0000 UTC m=+0.122363482 container init f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 04:56:43 np0005595445 podman[187084]: 2026-01-26 09:56:43.24111374 +0000 UTC m=+0.129422543 container start f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 04:56:43 np0005595445 bash[187084]: f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3
Jan 26 04:56:43 np0005595445 podman[187084]: 2026-01-26 09:56:43.150180212 +0000 UTC m=+0.038489075 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:56:43 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:56:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:56:43 np0005595445 python3.9[187222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:43.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:43.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:44 np0005595445 python3.9[187375]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:44 np0005595445 python3.9[187527]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:45 np0005595445 python3.9[187679]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:45.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:45.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:46 np0005595445 auditd[700]: Audit daemon rotating log files
Jan 26 04:56:46 np0005595445 python3.9[187832]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:47 np0005595445 python3.9[187984]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:47.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:47 np0005595445 python3.9[188137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:47.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:56:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:56:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:49.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:56:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:49.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:56:50 np0005595445 python3.9[188290]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:50 np0005595445 python3.9[188413]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421409.541111-2305-105801427888392/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:51 np0005595445 python3.9[188565]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:56:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:51.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:56:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:56:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:56:52 np0005595445 python3.9[188689]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421410.9390295-2305-157806444756689/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:52 np0005595445 python3.9[188841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:53 np0005595445 python3.9[188964]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421412.2024925-2305-102792998276836/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:53.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.917 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:56:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.918 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:56:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:56:53.918 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:56:53 np0005595445 python3.9[189117]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:54 np0005595445 python3.9[189240]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421413.445095-2305-24697996549822/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:55 np0005595445 python3.9[189392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd110000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:55 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:55 np0005595445 python3.9[189531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421414.7412686-2305-26470474788187/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:55.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:56 np0005595445 podman[189631]: 2026-01-26 09:56:56.30894044 +0000 UTC m=+0.069932259 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 04:56:56 np0005595445 python3.9[189703]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:57 np0005595445 python3.9[189826]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421415.9786768-2305-257256167709928/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8000da0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:57 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:57.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:56:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:57.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:58 np0005595445 python3.9[189979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095658 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:56:58 np0005595445 python3.9[190127]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421417.404612-2305-154213677875054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:56:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095659 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:56:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:59 np0005595445 python3.9[190279]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:56:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:56:59 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:56:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:56:59.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:56:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:56:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:56:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:56:59.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:00 np0005595445 python3.9[190403]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421418.8900807-2305-92383411143530/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:00 np0005595445 python3.9[190555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:01 np0005595445 python3.9[190678]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421420.258999-2305-106366403099726/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:01 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:01.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:02 np0005595445 python3.9[190831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:02 np0005595445 python3.9[190954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421421.6240442-2305-272277287284044/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f8001ea0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:03 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:03 np0005595445 python3.9[191107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999955s ======
Jan 26 04:57:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:03.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999955s
Jan 26 04:57:04 np0005595445 python3.9[191230]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421423.0960348-2305-147349478184363/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:05 np0005595445 python3.9[191382]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc001ab0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:05 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:05.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:06 np0005595445 python3.9[191506]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421424.7271461-2305-186578840990222/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:06 np0005595445 python3.9[191658]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:07 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:07 np0005595445 python3.9[191781]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421426.3383253-2305-240542570516077/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:07.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:08 np0005595445 python3.9[191936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:09 np0005595445 python3.9[192059]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421427.9182436-2305-241615058389781/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:09 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:09.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:10.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:10 np0005595445 python3.9[192210]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:11 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:11.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:11 np0005595445 python3.9[192366]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 04:57:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999955s ======
Jan 26 04:57:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:12.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999955s
Jan 26 04:57:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095713 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:57:13 np0005595445 dbus-broker-launch[769]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 04:57:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:13 np0005595445 podman[192395]: 2026-01-26 09:57:13.40109565 +0000 UTC m=+0.157416331 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 04:57:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:13 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:13.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:13 np0005595445 python3.9[192550]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:14.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:14 np0005595445 python3.9[192702]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:15 np0005595445 python3.9[192854]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:15 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:15.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:16.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:16 np0005595445 python3.9[193007]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:17 np0005595445 python3.9[193159]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:17 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:17.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:17 np0005595445 python3.9[193314]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:18.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:18 np0005595445 python3.9[193540]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:19 np0005595445 python3.9[193724]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:19 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:19.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:20.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:20 np0005595445 python3.9[193877]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:21 np0005595445 python3.9[194029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:21 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:22 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:57:22 np0005595445 python3.9[194182]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:57:22 np0005595445 systemd[1]: Reloading.
Jan 26 04:57:22 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:57:22 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:57:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:22 np0005595445 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 04:57:22 np0005595445 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 04:57:22 np0005595445 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 04:57:22 np0005595445 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 04:57:22 np0005595445 systemd[1]: Starting libvirt logging daemon...
Jan 26 04:57:22 np0005595445 systemd[1]: Started libvirt logging daemon.
Jan 26 04:57:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:57:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:23 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:57:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:23 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001999912s ======
Jan 26 04:57:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:23.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001999912s
Jan 26 04:57:23 np0005595445 python3.9[194375]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:57:23 np0005595445 systemd[1]: Reloading.
Jan 26 04:57:23 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:57:23 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:57:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:24.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:24 np0005595445 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 04:57:24 np0005595445 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 04:57:24 np0005595445 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 04:57:24 np0005595445 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 04:57:24 np0005595445 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 04:57:24 np0005595445 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 04:57:24 np0005595445 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 04:57:24 np0005595445 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 04:57:24 np0005595445 systemd[1]: Started libvirt nodedev daemon.
Jan 26 04:57:24 np0005595445 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 04:57:24 np0005595445 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 04:57:24 np0005595445 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 04:57:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:57:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:57:25 np0005595445 python3.9[194602]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:57:25 np0005595445 systemd[1]: Reloading.
Jan 26 04:57:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:25 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:57:25 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:57:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f4004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:25 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:25 np0005595445 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 04:57:25 np0005595445 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 04:57:25 np0005595445 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 04:57:25 np0005595445 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 04:57:25 np0005595445 systemd[1]: Starting libvirt proxy daemon...
Jan 26 04:57:25 np0005595445 systemd[1]: Started libvirt proxy daemon.
Jan 26 04:57:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999956s ======
Jan 26 04:57:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:25.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999956s
Jan 26 04:57:25 np0005595445 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a476e045-7b7a-4a8e-94a5-35aae6c63fc9
Jan 26 04:57:25 np0005595445 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 26 04:57:25 np0005595445 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a476e045-7b7a-4a8e-94a5-35aae6c63fc9
Jan 26 04:57:25 np0005595445 setroubleshoot[194415]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 26 04:57:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:26.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:26 np0005595445 python3.9[194817]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:57:26 np0005595445 systemd[1]: Reloading.
Jan 26 04:57:26 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:57:26 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:57:26 np0005595445 podman[194819]: 2026-01-26 09:57:26.784493739 +0000 UTC m=+0.111848067 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 04:57:27 np0005595445 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 04:57:27 np0005595445 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 04:57:27 np0005595445 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 04:57:27 np0005595445 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 04:57:27 np0005595445 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 04:57:27 np0005595445 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 04:57:27 np0005595445 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 04:57:27 np0005595445 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 04:57:27 np0005595445 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 04:57:27 np0005595445 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 04:57:27 np0005595445 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 04:57:27 np0005595445 systemd[1]: Started libvirt QEMU daemon.
Jan 26 04:57:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:27 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd104001cd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:27.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:28.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:28 np0005595445 python3.9[195054]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:57:28 np0005595445 systemd[1]: Reloading.
Jan 26 04:57:28 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:57:28 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:57:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:28 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:57:28 np0005595445 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 04:57:28 np0005595445 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 04:57:28 np0005595445 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 04:57:28 np0005595445 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 04:57:28 np0005595445 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 04:57:28 np0005595445 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 04:57:28 np0005595445 systemd[1]: Starting libvirt secret daemon...
Jan 26 04:57:28 np0005595445 systemd[1]: Started libvirt secret daemon.
Jan 26 04:57:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:29 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:57:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:29 np0005595445 python3.9[195291]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:29 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:29.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:30.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:30 np0005595445 python3.9[195444]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 04:57:31 np0005595445 python3.9[195596]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:31 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:31.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:32.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:32 np0005595445 python3.9[195751]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 04:57:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:33 np0005595445 python3.9[195901]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd104001fd0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:33 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc002fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:33.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:33 np0005595445 python3.9[196023]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421452.6144001-3379-247936633090530/.source.xml follow=False _original_basename=secret.xml.j2 checksum=8bb860fb1574c7989940fddd89a1bc8580864aba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:34.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:34 np0005595445 python3.9[196175]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 1a70b85d-e3fd-5814-8a6a-37ea00fcae30#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095735 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:57:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:35 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:35.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:35 np0005595445 python3.9[196338]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:36 np0005595445 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 04:57:36 np0005595445 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.152s CPU time.
Jan 26 04:57:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:36.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:36 np0005595445 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 04:57:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:37 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:57:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:37.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:57:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:38.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:38 np0005595445 python3.9[196802]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:39 np0005595445 python3.9[196979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:39 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:39.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:40.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:40 np0005595445 python3.9[197103]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421458.8163602-3544-230765070869054/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:41 np0005595445 python3.9[197255]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:41 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:41.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:57:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:42.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:57:42 np0005595445 python3.9[197408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:42 np0005595445 python3.9[197486]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:43 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:43 np0005595445 podman[197611]: 2026-01-26 09:57:43.654488443 +0000 UTC m=+0.104179637 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 04:57:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:43.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:43 np0005595445 python3.9[197661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:44.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:44 np0005595445 python3.9[197743]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wjj7r4l8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:45 np0005595445 python3.9[197895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0f80027c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:45 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:57:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:45.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:57:45 np0005595445 python3.9[197974]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:46.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:46 np0005595445 python3.9[198126]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:47 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:47.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:47 np0005595445 python3[198280]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 04:57:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:48.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:48 np0005595445 python3.9[198432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:49 np0005595445 python3.9[198510]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e8003db0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd1040028f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:49 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:57:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:49.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:50.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:50 np0005595445 python3.9[198663]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:50 np0005595445 python3.9[198788]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421469.5502386-3811-173750776529031/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:51 np0005595445 kernel: ganesha.nfsd[189468]: segfault at 50 ip 00007fd192f2d32e sp 00007fd118ff8210 error 4 in libntirpc.so.5.8[7fd192f12000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 04:57:51 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:57:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[187128]: 26/01/2026 09:57:51 : epoch 69773a5b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0fc003e70 fd 37 proxy ignored for local
Jan 26 04:57:51 np0005595445 systemd[1]: Started Process Core Dump (PID 198911/UID 0).
Jan 26 04:57:51 np0005595445 python3.9[198943]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:51.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:52.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:52 np0005595445 python3.9[199021]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:52 np0005595445 systemd-coredump[198913]: Process 187149 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007fd192f2d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:57:52 np0005595445 systemd[1]: systemd-coredump@5-198911-0.service: Deactivated successfully.
Jan 26 04:57:52 np0005595445 systemd[1]: systemd-coredump@5-198911-0.service: Consumed 1.218s CPU time.
Jan 26 04:57:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:52 np0005595445 podman[199050]: 2026-01-26 09:57:52.749455206 +0000 UTC m=+0.047676070 container died f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 04:57:52 np0005595445 systemd[1]: var-lib-containers-storage-overlay-7db4ed4d191b8d4606088a1397a8878c3b2be7577ad0f488114cb7161674c534-merged.mount: Deactivated successfully.
Jan 26 04:57:52 np0005595445 podman[199050]: 2026-01-26 09:57:52.799785506 +0000 UTC m=+0.098006310 container remove f91999dbe8ac46def68a1db73fc621f26c01213fbdcc9e21e2564db7aefa11b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 26 04:57:52 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:57:53 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:57:53 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.690s CPU time.
Jan 26 04:57:53 np0005595445 python3.9[199219]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.919 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:57:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:57:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:57:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:57:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:53.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:57:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:54.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:57:54 np0005595445 python3.9[199298]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:55 np0005595445 python3.9[199450]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:57:55 np0005595445 python3.9[199576]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769421474.5091093-3928-51725510179494/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:55.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:57:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:56.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:57:56 np0005595445 python3.9[199728]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:57 np0005595445 podman[199753]: 2026-01-26 09:57:57.312034395 +0000 UTC m=+0.084852354 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 04:57:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095757 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:57:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:57:57 np0005595445 python3.9[199902]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:57.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:57:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:57:58.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:57:58 np0005595445 python3.9[200080]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:57:59 np0005595445 python3.9[200235]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:57:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:57:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:57:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:57:59.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:58:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 04:58:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 04:58:00 np0005595445 python3.9[200390]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:58:01 np0005595445 python3.9[200545]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:58:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:58:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:01.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:58:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:02.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:02 np0005595445 python3.9[200700]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:03 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 6.
Jan 26 04:58:03 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:58:03 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.690s CPU time.
Jan 26 04:58:03 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:58:03 np0005595445 podman[200902]: 2026-01-26 09:58:03.466108183 +0000 UTC m=+0.051168334 container create 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:58:03 np0005595445 python3.9[200873]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:58:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:03 np0005595445 podman[200902]: 2026-01-26 09:58:03.44306077 +0000 UTC m=+0.028120911 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:58:03 np0005595445 podman[200902]: 2026-01-26 09:58:03.555216121 +0000 UTC m=+0.140276252 container init 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 26 04:58:03 np0005595445 podman[200902]: 2026-01-26 09:58:03.564471952 +0000 UTC m=+0.149532073 container start 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 04:58:03 np0005595445 bash[200902]: 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba
Jan 26 04:58:03 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:58:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:03 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:58:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:03 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:58:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:58:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:04.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:04 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:58:04 np0005595445 python3.9[201060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421482.8390465-4144-196388026304540/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:05 np0005595445 python3.9[201234]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:58:05 np0005595445 python3.9[201358]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421484.5003583-4189-57149880805252/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:05.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:06.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:06 np0005595445 python3.9[201510]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:58:07 np0005595445 python3.9[201633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421486.0091622-4234-35548270332264/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:08.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:08 np0005595445 python3.9[201786]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:58:08 np0005595445 systemd[1]: Reloading.
Jan 26 04:58:08 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:58:08 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:58:08 np0005595445 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 04:58:09 np0005595445 python3.9[201976]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 04:58:09 np0005595445 systemd[1]: Reloading.
Jan 26 04:58:09 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:58:09 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:58:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:10 np0005595445 systemd[1]: Reloading.
Jan 26 04:58:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:10.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:10 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:58:10 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:58:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:10 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:58:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:10 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:58:10 np0005595445 systemd[1]: session-52.scope: Deactivated successfully.
Jan 26 04:58:10 np0005595445 systemd[1]: session-52.scope: Consumed 3min 56.093s CPU time.
Jan 26 04:58:10 np0005595445 systemd-logind[783]: Session 52 logged out. Waiting for processes to exit.
Jan 26 04:58:10 np0005595445 systemd-logind[783]: Removed session 52.
Jan 26 04:58:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:14.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:14 np0005595445 podman[202078]: 2026-01-26 09:58:14.35405376 +0000 UTC m=+0.122663052 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 04:58:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095815 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:58:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:16.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:58:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:16 : epoch 69773aab : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:58:16 np0005595445 systemd-logind[783]: New session 53 of user zuul.
Jan 26 04:58:16 np0005595445 systemd[1]: Started Session 53 of User zuul.
Jan 26 04:58:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:17 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31c0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:17 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:17 np0005595445 python3.9[202274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:58:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:18 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31a4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:19 np0005595445 python3.9[202454]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:58:19 np0005595445 network[202471]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:58:19 np0005595445 network[202472]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:58:19 np0005595445 network[202473]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:58:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095819 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:58:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:19 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f319c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:19 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31c4001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:20 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:20.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:21 np0005595445 kernel: ganesha.nfsd[202111]: segfault at 50 ip 00007f324c40932e sp 00007f31d27fb210 error 4 in libntirpc.so.5.8[7f324c3ee000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 26 04:58:21 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 04:58:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[200918]: 26/01/2026 09:58:21 : epoch 69773aab : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f31b8002520 fd 38 proxy ignored for local
Jan 26 04:58:21 np0005595445 systemd[1]: Started Process Core Dump (PID 202534/UID 0).
Jan 26 04:58:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:22.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:22 np0005595445 systemd-coredump[202535]: Process 200926 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f324c40932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 04:58:22 np0005595445 systemd[1]: systemd-coredump@6-202534-0.service: Deactivated successfully.
Jan 26 04:58:22 np0005595445 systemd[1]: systemd-coredump@6-202534-0.service: Consumed 1.146s CPU time.
Jan 26 04:58:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:22 np0005595445 podman[202552]: 2026-01-26 09:58:22.684971921 +0000 UTC m=+0.051553829 container died 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:58:22 np0005595445 systemd[1]: var-lib-containers-storage-overlay-517cd0d582ca3a4460bb9bc745e6e8235de97ea25035af73605c7c836f342957-merged.mount: Deactivated successfully.
Jan 26 04:58:22 np0005595445 podman[202552]: 2026-01-26 09:58:22.731099559 +0000 UTC m=+0.097681407 container remove 47a793f86a4439f31eda22b535baa072091a02712074afe081570a9bf67e7cba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 04:58:22 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 04:58:22 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 04:58:22 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.697s CPU time.
Jan 26 04:58:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:24.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:25 np0005595445 python3.9[202798]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 04:58:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:26.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:26 np0005595445 python3.9[202883]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:58:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095827 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:58:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:28.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:28.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:28 np0005595445 podman[202886]: 2026-01-26 09:58:28.312899848 +0000 UTC m=+0.095745730 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 04:58:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:30.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:30 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:58:30 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:58:30 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:58:30 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:58:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:32.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:32.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:32 np0005595445 python3.9[203140]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:58:33 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 7.
Jan 26 04:58:33 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:58:33 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.697s CPU time.
Jan 26 04:58:33 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 04:58:33 np0005595445 podman[203289]: 2026-01-26 09:58:33.559859285 +0000 UTC m=+0.058404942 container create cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 04:58:33 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:33 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:33 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:33 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 04:58:33 np0005595445 podman[203289]: 2026-01-26 09:58:33.540796551 +0000 UTC m=+0.039342228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 04:58:33 np0005595445 podman[203289]: 2026-01-26 09:58:33.634956333 +0000 UTC m=+0.133502040 container init cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 04:58:33 np0005595445 podman[203289]: 2026-01-26 09:58:33.641207532 +0000 UTC m=+0.139753199 container start cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 26 04:58:33 np0005595445 bash[203289]: cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 04:58:33 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 04:58:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:58:33 np0005595445 python3.9[203356]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:58:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:58:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:58:34 np0005595445 python3.9[203573]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:58:35 np0005595445 python3.9[203725]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:58:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:36.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:36.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:36 np0005595445 python3.9[203879]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:58:36 np0005595445 python3.9[204002]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421515.7588308-241-59159069202668/.source.iscsi _original_basename=.pj379ukk follow=False checksum=e8ca1be981ffe6ba52507fee4e91187479634914 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:38.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:38 np0005595445 python3.9[204155]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:38 np0005595445 python3.9[204307]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:58:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 04:58:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:58:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:58:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:58:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:40.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:40 np0005595445 python3.9[204485]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:58:40 np0005595445 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 04:58:41 np0005595445 python3.9[204641]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:58:41 np0005595445 systemd[1]: Reloading.
Jan 26 04:58:41 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:58:41 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:58:41 np0005595445 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 04:58:41 np0005595445 systemd[1]: Starting Open-iSCSI...
Jan 26 04:58:41 np0005595445 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 04:58:41 np0005595445 systemd[1]: Started Open-iSCSI.
Jan 26 04:58:41 np0005595445 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 04:58:41 np0005595445 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 04:58:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:58:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:42.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:58:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095843 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:58:43 np0005595445 python3.9[204840]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:58:43 np0005595445 network[204857]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:58:43 np0005595445 network[204858]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:58:43 np0005595445 network[204859]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:58:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:44.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:44.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:44 np0005595445 podman[204875]: 2026-01-26 09:58:44.549537013 +0000 UTC m=+0.136528262 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000001a:nfs.cephfs.0: -2
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 04:58:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 04:58:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:46.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:46.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b44000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:48.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095849 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:58:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:50.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:50.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:51 np0005595445 python3.9[205179]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:58:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 04:58:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:52.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 04:58:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:52.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.964670) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532964730, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1603, "num_deletes": 255, "total_data_size": 4107087, "memory_usage": 4167248, "flush_reason": "Manual Compaction"}
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532984900, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2675326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18200, "largest_seqno": 19798, "table_properties": {"data_size": 2668688, "index_size": 3773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13136, "raw_average_key_size": 18, "raw_value_size": 2655560, "raw_average_value_size": 3804, "num_data_blocks": 169, "num_entries": 698, "num_filter_entries": 698, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421388, "oldest_key_time": 1769421388, "file_creation_time": 1769421532, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 20297 microseconds, and 6359 cpu microseconds.
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.984964) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2675326 bytes OK
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.984992) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986543) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986560) EVENT_LOG_v1 {"time_micros": 1769421532986555, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.986632) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 4099775, prev total WAL file size 4099775, number of live WAL files 2.
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.987882) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2612KB)], [33(11MB)]
Jan 26 04:58:52 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421532987953, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14277469, "oldest_snapshot_seqno": -1}
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 5002 keys, 13831980 bytes, temperature: kUnknown
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533054339, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13831980, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13796599, "index_size": 21767, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126827, "raw_average_key_size": 25, "raw_value_size": 13704085, "raw_average_value_size": 2739, "num_data_blocks": 897, "num_entries": 5002, "num_filter_entries": 5002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421532, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.055290) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13831980 bytes
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.081875) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.1 rd, 206.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 11.1 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 5526, records dropped: 524 output_compression: NoCompression
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.081920) EVENT_LOG_v1 {"time_micros": 1769421533081904, "job": 18, "event": "compaction_finished", "compaction_time_micros": 66997, "compaction_time_cpu_micros": 31288, "output_level": 6, "num_output_files": 1, "total_output_size": 13831980, "num_input_records": 5526, "num_output_records": 5002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533082566, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421533084743, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:52.987774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-09:58:53.084813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 04:58:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:53 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:58:53 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:58:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:53 np0005595445 systemd[1]: Reloading.
Jan 26 04:58:53 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:58:53 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:58:53 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:58:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 04:58:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 04:58:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:58:53.920 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 04:58:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:54.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:54 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:58:54 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:58:54 np0005595445 systemd[1]: run-r61878c67b40549a48e0a13bfa14a249d.service: Deactivated successfully.
Jan 26 04:58:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:54.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:55 np0005595445 python3.9[205497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 04:58:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:56.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:56.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:56 np0005595445 python3.9[205651]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 04:58:57 np0005595445 python3.9[205808]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:58:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:58:57 np0005595445 python3.9[205932]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421536.6091857-505-153600535190746/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000070s ======
Jan 26 04:58:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:58:58.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000070s
Jan 26 04:58:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:58:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:58:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:58:58.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:58:58 np0005595445 podman[206056]: 2026-01-26 09:58:58.671581018 +0000 UTC m=+0.085916966 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 04:58:58 np0005595445 python3.9[206098]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:58:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:58:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:58:59 np0005595445 python3.9[206279]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:59:00 np0005595445 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 04:59:00 np0005595445 systemd[1]: Stopped Load Kernel Modules.
Jan 26 04:59:00 np0005595445 systemd[1]: Stopping Load Kernel Modules...
Jan 26 04:59:00 np0005595445 systemd[1]: Starting Load Kernel Modules...
Jan 26 04:59:00 np0005595445 systemd[1]: Finished Load Kernel Modules.
Jan 26 04:59:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:00.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:00.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:00 np0005595445 python3.9[206435]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:59:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:59:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:02.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:59:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:59:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:02.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:59:02 np0005595445 python3.9[206591]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:59:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:03 np0005595445 python3.9[206743]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:59:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:03 np0005595445 python3.9[206866]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421542.606024-658-96448631358218/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:59:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:04.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:59:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:04.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:04 np0005595445 python3.9[207019]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:59:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b300031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:05 np0005595445 python3.9[207172]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:06.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:06.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:06 np0005595445 python3.9[207325]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:07 np0005595445 python3.9[207477]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:08.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:59:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:08.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:59:08 np0005595445 python3.9[207630]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:08 np0005595445 python3.9[207782]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:09 np0005595445 python3.9[207935]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:10.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000023s ======
Jan 26 04:59:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:10.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Jan 26 04:59:10 np0005595445 python3.9[208087]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:11 np0005595445 python3.9[208239]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 04:59:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:12 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:12.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:12.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:12 np0005595445 python3.9[208394]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 04:59:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:13 np0005595445 python3.9[208547]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:13 np0005595445 systemd[1]: Listening on multipathd control socket.
Jan 26 04:59:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:14 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:14.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:14.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:14 np0005595445 python3.9[208704]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:14 np0005595445 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 04:59:14 np0005595445 udevadm[208709]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 04:59:14 np0005595445 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 04:59:14 np0005595445 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 04:59:14 np0005595445 multipathd[208712]: --------start up--------
Jan 26 04:59:14 np0005595445 multipathd[208712]: read /etc/multipath.conf
Jan 26 04:59:14 np0005595445 multipathd[208712]: path checkers start up
Jan 26 04:59:14 np0005595445 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 04:59:14 np0005595445 podman[208720]: 2026-01-26 09:59:14.804629225 +0000 UTC m=+0.109464713 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 04:59:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:16 np0005595445 python3.9[208899]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 04:59:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:16 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:16.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:16.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:16 np0005595445 python3.9[209051]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 04:59:16 np0005595445 kernel: Key type psk registered
Jan 26 04:59:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:17 np0005595445 python3.9[209214]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 04:59:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:18 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:18.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:18.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:18 np0005595445 python3.9[209338]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769421557.13052-1048-248105863941937/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:19 np0005595445 python3.9[209515]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:20 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:20.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:20.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:20 np0005595445 python3.9[209669]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:59:20 np0005595445 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 04:59:20 np0005595445 systemd[1]: Stopped Load Kernel Modules.
Jan 26 04:59:20 np0005595445 systemd[1]: Stopping Load Kernel Modules...
Jan 26 04:59:20 np0005595445 systemd[1]: Starting Load Kernel Modules...
Jan 26 04:59:20 np0005595445 systemd[1]: Finished Load Kernel Modules.
Jan 26 04:59:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:22 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:22.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:22.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:22 np0005595445 python3.9[209826]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 04:59:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:24 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:24.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:24 np0005595445 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 04:59:25 np0005595445 systemd[1]: Reloading.
Jan 26 04:59:25 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:59:25 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:59:25 np0005595445 systemd[1]: Reloading.
Jan 26 04:59:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095925 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 04:59:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:25 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:59:25 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:59:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:25 np0005595445 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 04:59:25 np0005595445 systemd-logind[783]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 04:59:25 np0005595445 systemd-logind[783]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 04:59:25 np0005595445 lvm[209947]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 04:59:25 np0005595445 lvm[209947]: VG ceph_vg0 finished
Jan 26 04:59:26 np0005595445 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 04:59:26 np0005595445 systemd[1]: Starting man-db-cache-update.service...
Jan 26 04:59:26 np0005595445 systemd[1]: Reloading.
Jan 26 04:59:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:26 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:26 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:59:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:26.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:26 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:59:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:26.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:26 np0005595445 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 04:59:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:27 np0005595445 python3.9[211187]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:59:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:27 np0005595445 systemd[1]: Stopping Open-iSCSI...
Jan 26 04:59:27 np0005595445 iscsid[204682]: iscsid shutting down.
Jan 26 04:59:27 np0005595445 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 04:59:27 np0005595445 systemd[1]: Stopped Open-iSCSI.
Jan 26 04:59:27 np0005595445 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 04:59:27 np0005595445 systemd[1]: Starting Open-iSCSI...
Jan 26 04:59:27 np0005595445 systemd[1]: Started Open-iSCSI.
Jan 26 04:59:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:28 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:28 np0005595445 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 04:59:28 np0005595445 systemd[1]: Finished man-db-cache-update.service.
Jan 26 04:59:28 np0005595445 systemd[1]: man-db-cache-update.service: Consumed 1.886s CPU time.
Jan 26 04:59:28 np0005595445 systemd[1]: run-ra0912b71835648ffa13a1280f2905ef2.service: Deactivated successfully.
Jan 26 04:59:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:28.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:28 np0005595445 python3.9[211457]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 04:59:28 np0005595445 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 04:59:28 np0005595445 multipathd[208712]: exit (signal)
Jan 26 04:59:28 np0005595445 multipathd[208712]: --------shut down-------
Jan 26 04:59:28 np0005595445 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 04:59:28 np0005595445 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 04:59:28 np0005595445 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 04:59:28 np0005595445 podman[211459]: 2026-01-26 09:59:28.970719313 +0000 UTC m=+0.074856301 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 04:59:29 np0005595445 multipathd[211483]: --------start up--------
Jan 26 04:59:29 np0005595445 multipathd[211483]: read /etc/multipath.conf
Jan 26 04:59:29 np0005595445 multipathd[211483]: path checkers start up
Jan 26 04:59:29 np0005595445 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 04:59:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:30 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:30 np0005595445 python3.9[211641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 04:59:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:30.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:31 np0005595445 python3.9[211797]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:32.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:32.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:32 np0005595445 python3.9[211950]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 04:59:32 np0005595445 systemd[1]: Reloading.
Jan 26 04:59:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:32 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 04:59:32 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 04:59:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 04:59:33 np0005595445 python3.9[212136]: ansible-ansible.builtin.service_facts Invoked
Jan 26 04:59:33 np0005595445 network[212153]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 04:59:33 np0005595445 network[212154]: 'network-scripts' will be removed from distribution in near future.
Jan 26 04:59:33 np0005595445 network[212155]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 04:59:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:34 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:34.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:34.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:36.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:36 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 04:59:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 04:59:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:59:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 04:59:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:37 np0005595445 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 04:59:37 np0005595445 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 04:59:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:38.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:38.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 04:59:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:40.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:40.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:41 np0005595445 python3.9[212565]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 04:59:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:42 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:42.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:42.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:42 np0005595445 python3.9[212719]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:43 np0005595445 python3.9[212874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:43 np0005595445 python3.9[213028]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:44 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:44.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:44.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:45 np0005595445 podman[213077]: 2026-01-26 09:59:45.362464126 +0000 UTC m=+0.131137573 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 04:59:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/095945 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 04:59:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b30002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:45 np0005595445 python3.9[213208]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:46.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000059s ======
Jan 26 04:59:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:46.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000059s
Jan 26 04:59:46 np0005595445 python3.9[213361]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:47 np0005595445 python3.9[213514]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:48.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:49 np0005595445 python3.9[213668]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 04:59:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:50.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:50 np0005595445 python3.9[213822]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:51 np0005595445 python3.9[213974]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:51 np0005595445 python3.9[214129]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000058s ======
Jan 26 04:59:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:52.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000058s
Jan 26 04:59:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:52.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:52 np0005595445 python3.9[214281]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:53 np0005595445 python3.9[214433]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.921 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 04:59:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 04:59:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 09:59:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 04:59:54 np0005595445 python3.9[214586]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:54.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:54.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:54 np0005595445 python3.9[214738]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:55 np0005595445 python3.9[214890]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:59:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:56.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:59:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:56 np0005595445 python3.9[215043]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:57 np0005595445 python3.9[215195]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 04:59:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:58 np0005595445 python3.9[215348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 04:59:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:09:59:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 04:59:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 04:59:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 04:59:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:09:59:58.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 04:59:58 np0005595445 python3.9[215500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:59 np0005595445 podman[215624]: 2026-01-26 09:59:59.263438291 +0000 UTC m=+0.048495525 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 04:59:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 04:59:59 np0005595445 python3.9[215671]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 04:59:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 09:59:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:00.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:00 np0005595445 python3.9[215849]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:00:00 np0005595445 ceph-mon[80107]: overall HEALTH_OK
Jan 26 05:00:00 np0005595445 python3.9[216001]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:00:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:01 np0005595445 python3.9[216153]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:00:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:00:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:02.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:00:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:02 np0005595445 python3.9[216306]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:03 np0005595445 python3.9[216459]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 05:00:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:04.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:04 np0005595445 python3.9[216611]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 05:00:04 np0005595445 systemd[1]: Reloading.
Jan 26 05:00:05 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 05:00:05 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 05:00:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:05 np0005595445 python3.9[216799]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.110889) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606110967, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 941, "num_deletes": 251, "total_data_size": 2175388, "memory_usage": 2212640, "flush_reason": "Manual Compaction"}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 26 05:00:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606126954, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1418430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19804, "largest_seqno": 20739, "table_properties": {"data_size": 1414143, "index_size": 2003, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9454, "raw_average_key_size": 19, "raw_value_size": 1405560, "raw_average_value_size": 2898, "num_data_blocks": 90, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421533, "oldest_key_time": 1769421533, "file_creation_time": 1769421606, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 16142 microseconds, and 8230 cpu microseconds.
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.127041) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1418430 bytes OK
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.127068) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128506) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128535) EVENT_LOG_v1 {"time_micros": 1769421606128528, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.128578) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2170674, prev total WAL file size 2170674, number of live WAL files 2.
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.129679) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1385KB)], [36(13MB)]
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606129773, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15250410, "oldest_snapshot_seqno": -1}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4971 keys, 13068717 bytes, temperature: kUnknown
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606207233, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13068717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13034209, "index_size": 20958, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126769, "raw_average_key_size": 25, "raw_value_size": 12942829, "raw_average_value_size": 2603, "num_data_blocks": 861, "num_entries": 4971, "num_filter_entries": 4971, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421606, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.207503) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13068717 bytes
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.208930) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 196.6 rd, 168.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 13.2 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(20.0) write-amplify(9.2) OK, records in: 5487, records dropped: 516 output_compression: NoCompression
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.208960) EVENT_LOG_v1 {"time_micros": 1769421606208947, "job": 20, "event": "compaction_finished", "compaction_time_micros": 77586, "compaction_time_cpu_micros": 24848, "output_level": 6, "num_output_files": 1, "total_output_size": 13068717, "num_input_records": 5487, "num_output_records": 4971, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606209509, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421606213869, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.129596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:00:06.213960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:00:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:06.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:06 np0005595445 python3.9[216952]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:07 np0005595445 python3.9[217105]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:07 np0005595445 python3.9[217259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:08.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:08.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:08 np0005595445 python3.9[217412]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:09 np0005595445 python3.9[217567]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:09 np0005595445 python3.9[217721]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:10.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:10.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:10 np0005595445 python3.9[217874]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 05:00:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:11 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:12 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:12.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:12.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:12 np0005595445 python3.9[218028]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:13 np0005595445 python3.9[218180]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20001510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:13 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:13 np0005595445 python3.9[218333]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:14 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:14.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:14.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:14 np0005595445 python3.9[218485]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:15 np0005595445 python3.9[218637]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:15 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:15 np0005595445 podman[218638]: 2026-01-26 10:00:15.672074828 +0000 UTC m=+0.173757128 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 26 05:00:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:16 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:16 np0005595445 python3.9[218817]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:16.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:16 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:16.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:16 np0005595445 python3.9[218970]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:17 np0005595445 python3.9[219122]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:17 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:18 np0005595445 python3.9[219275]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:18 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:18.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:18.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:18 np0005595445 python3.9[219427]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:19 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:20 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:20 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:20.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:20.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:21 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:00:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 26 05:00:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:22 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000062s ======
Jan 26 05:00:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:22.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000062s
Jan 26 05:00:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:22 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:22.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100023 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:00:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:23 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:24 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:24.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:24 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:24.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:25 np0005595445 python3.9[219607]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 05:00:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:25 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:26 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:26 np0005595445 python3.9[219763]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 05:00:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:26.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:26 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:26.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:27 np0005595445 python3.9[219921]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 05:00:27 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:00:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:27 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:28 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:28 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:28.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:29 np0005595445 systemd-logind[783]: New session 54 of user zuul.
Jan 26 05:00:29 np0005595445 systemd[1]: Started Session 54 of User zuul.
Jan 26 05:00:29 np0005595445 systemd[1]: session-54.scope: Deactivated successfully.
Jan 26 05:00:29 np0005595445 systemd-logind[783]: Session 54 logged out. Waiting for processes to exit.
Jan 26 05:00:29 np0005595445 systemd-logind[783]: Removed session 54.
Jan 26 05:00:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:29 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b1c003430 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:29 np0005595445 podman[220084]: 2026-01-26 10:00:29.726212333 +0000 UTC m=+0.053086915 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 05:00:29 np0005595445 python3.9[220127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:30 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:30 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:30.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:30 np0005595445 python3.9[220250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421629.3873346-2655-38405105932395/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:30 np0005595445 python3.9[220402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:31 np0005595445 python3.9[220478]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:31 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:32 np0005595445 python3.9[220629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:32.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:32 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:32.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:32 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:00:32 np0005595445 python3.9[220750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421631.5555131-2655-189613474164202/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:33 np0005595445 python3.9[220900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24004140 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:33 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:33 np0005595445 python3.9[221022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421632.828948-2655-13822516807092/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:34 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:00:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:34 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:34.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:34.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:34 np0005595445 python3.9[221172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:34 np0005595445 python3.9[221293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421633.929032-2655-205495113716187/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:00:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:00:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:35 np0005595445 python3.9[221443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:35 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:35 np0005595445 python3.9[221567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421635.1166809-2655-211668518473945/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:36 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:36.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:36.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:37 np0005595445 python3.9[221719]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:00:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:37 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:38.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:38 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:00:38 np0005595445 python3.9[221872]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:00:39 np0005595445 python3.9[222024]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:00:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 05:00:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:39 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:40 np0005595445 python3.9[222202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:40 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:40.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:40 np0005595445 python3.9[222325]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769421639.5866902-2976-234382837844208/.source _original_basename=.dqxtjeui follow=False checksum=85aac6417ba27e9322cb5a036dc1fd0f95ce2de4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 05:00:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:41 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:41 np0005595445 python3.9[222598]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:00:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:42 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:42.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:42.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:00:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:42 np0005595445 python3.9[222781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100043 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:00:43 np0005595445 python3.9[222902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421642.1911008-3054-222265330781035/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:43 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:44 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:44.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:44 np0005595445 python3.9[223053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 05:00:45 np0005595445 python3.9[223174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769421644.2105863-3099-112455315419283/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 05:00:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:45 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:46 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000061s ======
Jan 26 05:00:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000061s
Jan 26 05:00:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:46.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:46 np0005595445 podman[223299]: 2026-01-26 10:00:46.32042889 +0000 UTC m=+0.126948190 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 05:00:46 np0005595445 python3.9[223346]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 05:00:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:47 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14002690 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:47 np0005595445 python3.9[223505]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 05:00:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:00:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:48 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:48.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:48.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:49 np0005595445 python3[223682]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 05:00:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:49 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:50 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:50.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:50.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:51 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ea0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:52 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:52.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:53 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.922 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:00:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.923 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:00:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:00:53.923 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:00:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:54 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ec0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:54.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:55 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:56 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:56.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003ee0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:57 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:00:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:58 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:00:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:00:58.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:00:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:00:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:00:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:00:58.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:00:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:00:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:00:59 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:00 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:00.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:00.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:01 np0005595445 podman[223791]: 2026-01-26 10:01:01.288894876 +0000 UTC m=+1.059005322 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 05:01:01 np0005595445 podman[223695]: 2026-01-26 10:01:01.399978048 +0000 UTC m=+12.270399925 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 05:01:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:01 np0005595445 podman[223836]: 2026-01-26 10:01:01.534455028 +0000 UTC m=+0.051911507 container create f8473b9a2106532a3892fa37e60a6e60a0eb71b6d886197e161aaec5fb0ab5d9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 05:01:01 np0005595445 podman[223836]: 2026-01-26 10:01:01.505002925 +0000 UTC m=+0.022459424 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 05:01:01 np0005595445 python3[223682]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 05:01:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:01 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:02 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f00 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:02.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:02.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:02 np0005595445 python3.9[224036]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:01:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:01:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:03 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:04 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004340 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:01:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:04.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:01:04 np0005595445 python3.9[224191]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 05:01:05 np0005595445 python3.9[224343]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 05:01:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f20 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:01:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3906 writes, 21K keys, 3906 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3906 writes, 3906 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1448 writes, 6861 keys, 1448 commit groups, 1.0 writes per commit group, ingest: 16.40 MB, 0.03 MB/s#012Interval WAL: 1448 writes, 1448 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    118.4      0.27              0.10        10    0.027       0      0       0.0       0.0#012  L6      1/0   12.46 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    157.9    134.3      0.85              0.25         9    0.095     43K   4846       0.0       0.0#012 Sum      1/0   12.46 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    119.4    130.5      1.13              0.35        19    0.059     43K   4846       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.3    135.1    135.5      0.47              0.16         8    0.059     22K   2570       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    157.9    134.3      0.85              0.25         9    0.095     43K   4846       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    119.5      0.27              0.10         9    0.030       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.032, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.1 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 8.34 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000192 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(461,7.96 MB,2.61993%) FilterBlock(19,127.67 KB,0.041013%) IndexBlock(19,252.55 KB,0.0811276%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 05:01:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:05 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b14003e10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:06 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:06.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:06.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:07 np0005595445 python3[224496]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 05:01:07 np0005595445 podman[224534]: 2026-01-26 10:01:07.309579739 +0000 UTC m=+0.057168022 container create e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 05:01:07 np0005595445 podman[224534]: 2026-01-26 10:01:07.280279749 +0000 UTC m=+0.027868042 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 26 05:01:07 np0005595445 python3[224496]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 26 05:01:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004360 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:07 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f40 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:01:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:08 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b24001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:08.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:08.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:08 np0005595445 python3.9[224728]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:01:09 np0005595445 python3.9[224882]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:01:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b18003bb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:09 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b20004360 fd 14 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:01:10 np0005595445 python3.9[225034]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769421669.5188947-3387-51787472195700/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 05:01:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[203352]: 26/01/2026 10:01:10 : epoch 69773ac9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7b3c003f40 fd 14 proxy ignored for local
Jan 26 05:01:10 np0005595445 kernel: ganesha.nfsd[204931]: segfault at 50 ip 00007f7bc5e6032e sp 00007f7b70ff8210 error 4 in libntirpc.so.5.8[7f7bc5e45000+2c000] likely on CPU 6 (core 0, socket 6)
Jan 26 05:01:10 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 05:01:10 np0005595445 systemd[1]: Started Process Core Dump (PID 225035/UID 0).
Jan 26 05:01:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:10.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:10.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:10 np0005595445 python3.9[225112]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 05:01:10 np0005595445 systemd[1]: Reloading.
Jan 26 05:01:10 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 05:01:10 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 05:01:11 np0005595445 systemd-coredump[225048]: Process 203358 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f7bc5e6032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 05:01:11 np0005595445 systemd[1]: systemd-coredump@7-225035-0.service: Deactivated successfully.
Jan 26 05:01:11 np0005595445 systemd[1]: systemd-coredump@7-225035-0.service: Consumed 1.190s CPU time.
Jan 26 05:01:11 np0005595445 podman[225230]: 2026-01-26 10:01:11.55532503 +0000 UTC m=+0.028854698 container died cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:01:11 np0005595445 systemd[1]: var-lib-containers-storage-overlay-18b268ab60ff9aff8b27d65bea4ca79576ef17e6e07f8e0cca83a86a786d46ca-merged.mount: Deactivated successfully.
Jan 26 05:01:11 np0005595445 podman[225230]: 2026-01-26 10:01:11.604426301 +0000 UTC m=+0.077955949 container remove cbf27e002fa6e7d390d75e124dda783fcdb8c722ca7000472b152c8413a63b8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:01:11 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 05:01:11 np0005595445 python3.9[225224]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 05:01:11 np0005595445 systemd[1]: Reloading.
Jan 26 05:01:11 np0005595445 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 05:01:11 np0005595445 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 05:01:12 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 05:01:12 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.924s CPU time.
Jan 26 05:01:12 np0005595445 systemd[1]: Starting nova_compute container...
Jan 26 05:01:12 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:01:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:12 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:12 np0005595445 podman[225312]: 2026-01-26 10:01:12.149597638 +0000 UTC m=+0.089145243 container init e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:01:12 np0005595445 podman[225312]: 2026-01-26 10:01:12.157688399 +0000 UTC m=+0.097235974 container start e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm)
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + sudo -E kolla_set_configs
Jan 26 05:01:12 np0005595445 podman[225312]: nova_compute
Jan 26 05:01:12 np0005595445 systemd[1]: Started nova_compute container.
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Validating config file
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying service configuration files
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Deleting /etc/ceph
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Creating directory /etc/ceph
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Writing out command to execute
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:12 np0005595445 nova_compute[225328]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 05:01:12 np0005595445 nova_compute[225328]: ++ cat /run_command
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + CMD=nova-compute
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + ARGS=
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + sudo kolla_copy_cacerts
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + [[ ! -n '' ]]
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + . kolla_extend_start
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 05:01:12 np0005595445 nova_compute[225328]: Running command: 'nova-compute'
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + umask 0022
Jan 26 05:01:12 np0005595445 nova_compute[225328]: + exec nova-compute
Jan 26 05:01:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:12.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:12.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:01:13 np0005595445 python3.9[225491]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.302 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.302 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.303 225332 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.303 225332 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 26 05:01:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:01:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:01:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:01:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:14 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:14.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.438 225332 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.462 225332 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:01:14 np0005595445 nova_compute[225328]: 2026-01-26 10:01:14.463 225332 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 26 05:01:14 np0005595445 python3.9[225645]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.152 225332 INFO nova.virt.driver [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.298 225332 INFO nova.compute.provider_config [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.321 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.322 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.323 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.324 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.325 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.326 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.327 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.328 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.329 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.330 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.331 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.332 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.333 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.334 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.335 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.336 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.337 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.338 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.339 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.340 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.341 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.342 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.343 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.344 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.345 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.346 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.347 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.348 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.349 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.350 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.351 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.352 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.353 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.354 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.355 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.356 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.357 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.358 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.359 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.360 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.361 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.362 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.363 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.364 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.365 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.366 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.367 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.368 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.369 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.370 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.371 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.372 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.373 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.374 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.375 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.376 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.377 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.378 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.379 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.380 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.381 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.382 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.383 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.384 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.385 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.386 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.387 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.388 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.389 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.390 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.391 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.392 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.393 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.394 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.395 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.396 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.397 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.398 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.399 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.400 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.401 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.402 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.403 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.404 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.405 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 WARNING oslo_config.cfg [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 05:01:15 np0005595445 nova_compute[225328]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 05:01:15 np0005595445 nova_compute[225328]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 05:01:15 np0005595445 nova_compute[225328]: and ``live_migration_inbound_addr`` respectively.
Jan 26 05:01:15 np0005595445 nova_compute[225328]: ).  Its value may be silently ignored in the future.#033[00m
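[editor's note] The warning above says ``live_migration_uri`` is deprecated in favor of ``live_migration_scheme`` and ``live_migration_inbound_addr``. Given that this deployment currently sets ``live_migration_uri = qemu+tls://%s/system`` (see the next line), an equivalent non-deprecated configuration would plausibly look like the sketch below; the address value is a hypothetical placeholder, not taken from this host:

```ini
# nova.conf sketch (hedged): replaces the deprecated live_migration_uri.
# qemu+tls scheme matches the logged value "qemu+tls://%s/system";
# the inbound address below is a made-up example for illustration.
[libvirt]
live_migration_scheme = tls
live_migration_inbound_addr = migration.example.internal
```

Nova then builds the target URI from the scheme and inbound address instead of substituting the host into ``live_migration_uri``.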
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.406 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.407 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.408 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_secret_uuid        = 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.409 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.410 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.411 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.412 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.413 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.414 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.415 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.416 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.417 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.418 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.419 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.420 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.421 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.422 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.423 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.424 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.425 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.426 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.427 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.428 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.429 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.430 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.431 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.432 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.433 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.434 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.435 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.436 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.437 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.438 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.439 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.440 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.441 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.442 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.443 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.444 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.445 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.446 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.447 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.448 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.449 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.450 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.451 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.452 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.453 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.454 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.455 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.456 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.457 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.458 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.459 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.460 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.461 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.462 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.463 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.464 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.465 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.466 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.467 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.468 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.469 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.470 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.471 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.472 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.473 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.474 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.475 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.476 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.477 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.478 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.479 225332 DEBUG oslo_service.service [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.481 225332 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.507 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.508 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 26 05:01:15 np0005595445 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 05:01:15 np0005595445 systemd[1]: Started libvirt QEMU daemon.
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.594 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f948768a1f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.598 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f948768a1f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.599 225332 INFO nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.615 225332 WARNING nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 26 05:01:15 np0005595445 nova_compute[225328]: 2026-01-26 10:01:15.617 225332 DEBUG nova.virt.libvirt.volume.mount [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 26 05:01:15 np0005595445 python3.9[225836]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 05:01:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:01:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:16.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:16 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:16.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.541 225332 INFO nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <host>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <uuid>0657a708-098a-4137-a4d8-8ea25323424c</uuid>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <arch>x86_64</arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model>EPYC-Rome-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <vendor>AMD</vendor>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <microcode version='16777317'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <signature family='23' model='49' stepping='0'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='x2apic'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='tsc-deadline'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='osxsave'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='hypervisor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='tsc_adjust'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='spec-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='stibp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='arch-capabilities'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='cmp_legacy'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='topoext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='virt-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='lbrv'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='tsc-scale'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='vmcb-clean'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='pause-filter'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='pfthreshold'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='svme-addr-chk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='rdctl-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='skip-l1dfl-vmentry'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='mds-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature name='pschange-mc-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <pages unit='KiB' size='4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <pages unit='KiB' size='2048'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <pages unit='KiB' size='1048576'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <power_management>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <suspend_mem/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </power_management>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <iommu support='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <migration_features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <live/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <uri_transports>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <uri_transport>tcp</uri_transport>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <uri_transport>rdma</uri_transport>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </uri_transports>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </migration_features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <topology>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <cells num='1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <cell id='0'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <memory unit='KiB'>7864304</memory>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <pages unit='KiB' size='2048'>0</pages>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <distances>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <sibling id='0' value='10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          </distances>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          <cpus num='8'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:          </cpus>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        </cell>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </cells>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </topology>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <cache>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </cache>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <secmodel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model>selinux</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <doi>0</doi>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </secmodel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <secmodel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model>dac</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <doi>0</doi>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </secmodel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </host>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <guest>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <os_type>hvm</os_type>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <arch name='i686'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <wordsize>32</wordsize>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <domain type='qemu'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <domain type='kvm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <pae/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <nonpae/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <acpi default='on' toggle='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <apic default='on' toggle='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <cpuselection/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <deviceboot/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <disksnapshot default='on' toggle='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <externalSnapshot/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </guest>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <guest>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <os_type>hvm</os_type>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <arch name='x86_64'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <wordsize>64</wordsize>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <domain type='qemu'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <domain type='kvm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <acpi default='on' toggle='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <apic default='on' toggle='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <cpuselection/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <deviceboot/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <disksnapshot default='on' toggle='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <externalSnapshot/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </guest>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 
Jan 26 05:01:16 np0005595445 nova_compute[225328]: </capabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.549 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.574 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 05:01:16 np0005595445 nova_compute[225328]: <domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <domain>kvm</domain>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <arch>i686</arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <vcpu max='4096'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <iothreads supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <os supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='firmware'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <loader supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>rom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pflash</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='readonly'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>yes</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='secure'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </loader>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </os>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='maximumMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <vendor>AMD</vendor>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='succor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='custom' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <memoryBacking supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='sourceType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>anonymous</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>memfd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </memoryBacking>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <disk supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='diskDevice'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>disk</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cdrom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>floppy</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>lun</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>fdc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>sata</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </disk>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <graphics supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vnc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egl-headless</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </graphics>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <video supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='modelType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vga</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cirrus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>none</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>bochs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ramfb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </video>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hostdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='mode'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>subsystem</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='startupPolicy'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>mandatory</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>requisite</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>optional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='subsysType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pci</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='capsType'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='pciBackend'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hostdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <rng supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>random</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </rng>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <filesystem supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='driverType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>path</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>handle</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtiofs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </filesystem>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tpm supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-tis</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-crb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emulator</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>external</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendVersion'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>2.0</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </tpm>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <redirdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </redirdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <channel supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </channel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <crypto supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </crypto>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <interface supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>passt</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </interface>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <panic supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>isa</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>hyperv</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </panic>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <console supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>null</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dev</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pipe</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stdio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>udp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tcp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu-vdagent</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </console>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <gic supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <vmcoreinfo supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <genid supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backingStoreInput supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backup supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <async-teardown supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <s390-pv supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <ps2 supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tdx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sev supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sgx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hyperv supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='features'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>relaxed</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vapic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>spinlocks</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vpindex</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>runtime</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>synic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stimer</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reset</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vendor_id</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>frequencies</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reenlightenment</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tlbflush</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ipi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>avic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emsr_bitmap</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>xmm_input</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <spinlocks>4095</spinlocks>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <stimer_direct>on</stimer_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hyperv>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <launchSecurity supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: </domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.582 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 05:01:16 np0005595445 nova_compute[225328]: <domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <domain>kvm</domain>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <arch>i686</arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <vcpu max='240'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <iothreads supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <os supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='firmware'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <loader supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>rom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pflash</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='readonly'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>yes</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='secure'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </loader>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </os>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='maximumMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <vendor>AMD</vendor>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='succor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='custom' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 podman[225905]: 2026-01-26 10:01:16.656502787 +0000 UTC m=+0.101892322 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <memoryBacking supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='sourceType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>anonymous</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>memfd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </memoryBacking>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <disk supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='diskDevice'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>disk</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cdrom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>floppy</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>lun</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ide</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>fdc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>sata</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </disk>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <graphics supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vnc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egl-headless</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </graphics>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <video supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='modelType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vga</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cirrus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>none</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>bochs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ramfb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </video>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hostdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='mode'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>subsystem</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='startupPolicy'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>mandatory</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>requisite</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>optional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='subsysType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pci</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='capsType'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='pciBackend'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hostdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <rng supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>random</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </rng>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <filesystem supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='driverType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>path</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>handle</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtiofs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </filesystem>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tpm supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-tis</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-crb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emulator</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>external</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendVersion'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>2.0</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </tpm>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <redirdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </redirdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <channel supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </channel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <crypto supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </crypto>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <interface supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>passt</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </interface>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <panic supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>isa</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>hyperv</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </panic>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <console supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>null</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dev</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pipe</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stdio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>udp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tcp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu-vdagent</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </console>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <gic supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <vmcoreinfo supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <genid supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backingStoreInput supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backup supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <async-teardown supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <s390-pv supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <ps2 supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tdx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sev supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sgx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hyperv supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='features'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>relaxed</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vapic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>spinlocks</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vpindex</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>runtime</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>synic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stimer</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reset</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vendor_id</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>frequencies</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reenlightenment</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tlbflush</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ipi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>avic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emsr_bitmap</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>xmm_input</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <spinlocks>4095</spinlocks>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <stimer_direct>on</stimer_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hyperv>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <launchSecurity supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: </domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.638 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.642 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 05:01:16 np0005595445 nova_compute[225328]: <domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <domain>kvm</domain>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <arch>x86_64</arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <vcpu max='4096'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <iothreads supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <os supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='firmware'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>efi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <loader supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>rom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pflash</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='readonly'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>yes</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='secure'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>yes</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </loader>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </os>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='maximumMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <vendor>AMD</vendor>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='succor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='custom' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <memoryBacking supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='sourceType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>anonymous</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>memfd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </memoryBacking>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <disk supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='diskDevice'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>disk</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cdrom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>floppy</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>lun</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>fdc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>sata</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </disk>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <graphics supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vnc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egl-headless</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </graphics>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <video supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='modelType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vga</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cirrus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>none</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>bochs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ramfb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </video>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hostdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='mode'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>subsystem</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='startupPolicy'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>mandatory</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>requisite</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>optional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='subsysType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pci</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='capsType'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='pciBackend'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hostdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <rng supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>random</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </rng>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <filesystem supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='driverType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>path</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>handle</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtiofs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </filesystem>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tpm supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-tis</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-crb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emulator</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>external</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendVersion'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>2.0</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </tpm>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <redirdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </redirdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <channel supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </channel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <crypto supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </crypto>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <interface supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>passt</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </interface>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <panic supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>isa</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>hyperv</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </panic>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <console supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>null</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dev</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pipe</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stdio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>udp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tcp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu-vdagent</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </console>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <gic supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <vmcoreinfo supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <genid supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backingStoreInput supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backup supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <async-teardown supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <s390-pv supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <ps2 supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tdx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sev supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sgx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hyperv supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='features'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>relaxed</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vapic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>spinlocks</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vpindex</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>runtime</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>synic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stimer</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reset</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vendor_id</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>frequencies</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reenlightenment</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tlbflush</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ipi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>avic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emsr_bitmap</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>xmm_input</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <spinlocks>4095</spinlocks>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <stimer_direct>on</stimer_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hyperv>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <launchSecurity supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: </domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.740 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 05:01:16 np0005595445 nova_compute[225328]: <domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <domain>kvm</domain>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <arch>x86_64</arch>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <vcpu max='240'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <iothreads supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <os supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='firmware'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <loader supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>rom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pflash</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='readonly'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>yes</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='secure'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>no</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </loader>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </os>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='maximumMigratable'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>on</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>off</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <vendor>AMD</vendor>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='succor'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <mode name='custom' supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ddpd-u'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sha512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm3'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sm4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Denverton-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amd-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='auto-ibrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='perfmon-v2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbpb'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='stibp-always-on'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='EPYC-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-128'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-256'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx10-512'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='prefetchiti'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Haswell-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512er'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512pf'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fma4'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tbm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xop'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='amx-tile'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-bf16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-fp16'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bitalg'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrc'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fzrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='la57'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='taa-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ifma'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cmpccxadd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fbsdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='fsrs'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ibrs-all'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='intel-psfd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='lam'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mcdt-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pbrsb-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='psdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='serialize'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vaes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='hle'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='rtm'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512bw'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512cd'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512dq'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512f'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='avx512vl'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='invpcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pcid'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='pku'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='mpx'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='core-capability'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='split-lock-detect'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='cldemote'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='erms'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='gfni'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdir64b'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='movdiri'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='xsaves'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='athlon-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='core2duo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='coreduo-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='n270-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='ss'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <blockers model='phenom-v1'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnow'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <feature name='3dnowext'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </blockers>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </mode>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </cpu>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <memoryBacking supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <enum name='sourceType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>anonymous</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <value>memfd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </memoryBacking>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <disk supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='diskDevice'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>disk</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cdrom</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>floppy</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>lun</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ide</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>fdc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>sata</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </disk>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <graphics supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vnc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egl-headless</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </graphics>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <video supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='modelType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vga</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>cirrus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>none</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>bochs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ramfb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </video>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hostdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='mode'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>subsystem</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='startupPolicy'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>mandatory</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>requisite</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>optional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='subsysType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pci</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>scsi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='capsType'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='pciBackend'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hostdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <rng supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtio-non-transitional</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>random</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>egd</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </rng>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <filesystem supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='driverType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>path</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>handle</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>virtiofs</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </filesystem>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tpm supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-tis</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tpm-crb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emulator</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>external</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendVersion'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>2.0</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </tpm>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <redirdev supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='bus'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>usb</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </redirdev>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <channel supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </channel>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <crypto supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendModel'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>builtin</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </crypto>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <interface supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='backendType'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>default</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>passt</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </interface>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <panic supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='model'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>isa</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>hyperv</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </panic>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <console supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='type'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>null</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vc</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pty</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dev</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>file</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>pipe</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stdio</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>udp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tcp</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>unix</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>qemu-vdagent</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>dbus</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </console>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </devices>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  <features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <gic supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <vmcoreinfo supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <genid supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backingStoreInput supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <backup supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <async-teardown supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <s390-pv supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <ps2 supported='yes'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <tdx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sev supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <sgx supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <hyperv supported='yes'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <enum name='features'>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>relaxed</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vapic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>spinlocks</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vpindex</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>runtime</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>synic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>stimer</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reset</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>vendor_id</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>frequencies</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>reenlightenment</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>tlbflush</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>ipi</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>avic</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>emsr_bitmap</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <value>xmm_input</value>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </enum>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      <defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <spinlocks>4095</spinlocks>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <stimer_direct>on</stimer_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:      </defaults>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    </hyperv>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:    <launchSecurity supported='no'/>
Jan 26 05:01:16 np0005595445 nova_compute[225328]:  </features>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: </domainCapabilities>
Jan 26 05:01:16 np0005595445 nova_compute[225328]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.813 225332 DEBUG nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.813 225332 INFO nova.virt.libvirt.host [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Secure Boot support detected#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.815 225332 INFO nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.824 225332 DEBUG nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.866 225332 INFO nova.virt.node [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Determined node identity d06842a0-5d13-4573-bb78-d433bbb380e4 from /var/lib/nova/compute_id#033[00m
Jan 26 05:01:16 np0005595445 nova_compute[225328]: 2026-01-26 10:01:16.975 225332 WARNING nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Compute nodes ['d06842a0-5d13-4573-bb78-d433bbb380e4'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.007 225332 INFO nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.046 225332 WARNING nova.compute.manager [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.046 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.047 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.048 225332 DEBUG oslo_concurrency.processutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:01:17 np0005595445 python3.9[226039]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 05:01:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100117 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:01:17 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:01:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:01:17 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1791287281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.539 225332 DEBUG oslo_concurrency.processutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:01:17 np0005595445 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 05:01:17 np0005595445 systemd[1]: Started libvirt nodedev daemon.
Jan 26 05:01:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.846 225332 WARNING nova.virt.libvirt.driver [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.847 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5295MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.847 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.848 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.862 225332 WARNING nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] No compute node record for compute-1.ctlplane.example.com:d06842a0-5d13-4573-bb78-d433bbb380e4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host d06842a0-5d13-4573-bb78-d433bbb380e4 could not be found.#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.896 225332 INFO nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: d06842a0-5d13-4573-bb78-d433bbb380e4#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.965 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:01:17 np0005595445 nova_compute[225328]: 2026-01-26 10:01:17.965 225332 DEBUG nova.compute.resource_tracker [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:01:18 np0005595445 python3.9[226260]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 05:01:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:18.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:18.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:18 np0005595445 systemd[1]: Stopping nova_compute container...
Jan 26 05:01:18 np0005595445 nova_compute[225328]: 2026-01-26 10:01:18.458 225332 DEBUG oslo_concurrency.lockutils [None req-e28ab1db-9965-4b0c-8c27-8782a862545f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:01:18 np0005595445 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:01:18 np0005595445 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:01:18 np0005595445 nova_compute[225328]: 2026-01-26 10:01:18.459 225332 DEBUG oslo_concurrency.lockutils [None req-3828e663-e96b-4b8f-a5c9-21ecec929cc2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:01:18 np0005595445 virtqemud[225791]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 05:01:18 np0005595445 virtqemud[225791]: hostname: compute-1
Jan 26 05:01:18 np0005595445 virtqemud[225791]: End of file while reading data: Input/output error
Jan 26 05:01:18 np0005595445 systemd[1]: libpod-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6.scope: Deactivated successfully.
Jan 26 05:01:18 np0005595445 systemd[1]: libpod-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6.scope: Consumed 3.675s CPU time.
Jan 26 05:01:18 np0005595445 podman[226264]: 2026-01-26 10:01:18.87747575 +0000 UTC m=+0.481365578 container died e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 05:01:19 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6-userdata-shm.mount: Deactivated successfully.
Jan 26 05:01:19 np0005595445 systemd[1]: var-lib-containers-storage-overlay-bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a-merged.mount: Deactivated successfully.
Jan 26 05:01:19 np0005595445 podman[226264]: 2026-01-26 10:01:19.271857593 +0000 UTC m=+0.875747391 container cleanup e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:01:19 np0005595445 podman[226264]: nova_compute
Jan 26 05:01:19 np0005595445 podman[226294]: nova_compute
Jan 26 05:01:19 np0005595445 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 05:01:19 np0005595445 systemd[1]: Stopped nova_compute container.
Jan 26 05:01:19 np0005595445 systemd[1]: Starting nova_compute container...
Jan 26 05:01:19 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:01:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc1f2a961f3a254d4d6e926a9d6f4bf5de3b1a727c611bc615442b6a446b988a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:19 np0005595445 podman[226307]: 2026-01-26 10:01:19.464277644 +0000 UTC m=+0.090812809 container init e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 05:01:19 np0005595445 podman[226307]: 2026-01-26 10:01:19.472099978 +0000 UTC m=+0.098635123 container start e5e330867e7e5da6ef9c6b9af3e2d2c9ff3fe5beb198d40aa1d27e78e35090e6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + sudo -E kolla_set_configs
Jan 26 05:01:19 np0005595445 podman[226307]: nova_compute
Jan 26 05:01:19 np0005595445 systemd[1]: Started nova_compute container.
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Validating config file
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying service configuration files
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /etc/ceph
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Creating directory /etc/ceph
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Writing out command to execute
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 05:01:19 np0005595445 nova_compute[226322]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 05:01:19 np0005595445 nova_compute[226322]: ++ cat /run_command
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + CMD=nova-compute
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + ARGS=
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + sudo kolla_copy_cacerts
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + [[ ! -n '' ]]
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + . kolla_extend_start
Jan 26 05:01:19 np0005595445 nova_compute[226322]: Running command: 'nova-compute'
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + umask 0022
Jan 26 05:01:19 np0005595445 nova_compute[226322]: + exec nova-compute
Jan 26 05:01:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:20.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:20.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.515 226326 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.516 226326 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.650 226326 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.676 226326 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:01:21 np0005595445 nova_compute[226322]: 2026-01-26 10:01:21.676 226326 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.192 226326 INFO nova.virt.driver [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 26 05:01:22 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 8.
Jan 26 05:01:22 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:01:22 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.924s CPU time.
Jan 26 05:01:22 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.323 226326 INFO nova.compute.provider_config [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.332 226326 DEBUG oslo_concurrency.lockutils [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.333 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.334 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.335 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.336 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.337 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.338 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.339 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.340 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.341 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.342 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.343 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.344 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.345 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.346 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.347 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.348 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.349 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.350 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.351 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.352 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.353 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.354 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.355 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.356 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.357 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.358 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:01:22.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.359 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:01:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:01:22.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.360 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.361 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.362 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.363 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.364 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.365 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.366 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.367 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.368 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.369 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.370 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.371 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.372 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.373 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.374 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.375 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.376 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.377 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.378 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.379 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.380 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.381 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.382 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.383 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.384 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.385 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.386 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.387 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.388 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.389 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.390 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.391 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.392 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.393 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.394 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.395 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.396 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.397 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.398 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.399 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.400 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.401 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.402 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.403 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.404 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.405 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.406 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.407 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.408 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.409 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.410 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.411 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.412 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.413 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.414 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.415 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.416 226326 WARNING oslo_config.cfg [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 05:01:22 np0005595445 nova_compute[226322]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 05:01:22 np0005595445 nova_compute[226322]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 05:01:22 np0005595445 nova_compute[226322]: and ``live_migration_inbound_addr`` respectively.
Jan 26 05:01:22 np0005595445 nova_compute[226322]: ).  Its value may be silently ignored in the future.#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.417 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.418 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.419 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_secret_uuid        = 1a70b85d-e3fd-5814-8a6a-37ea00fcae30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.420 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.421 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.422 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.423 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.424 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.425 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.426 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.427 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.428 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.429 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.430 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.431 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.432 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.433 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.434 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.435 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.436 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.437 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.438 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.439 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.440 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.441 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.442 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.443 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.443 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.444 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.445 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.446 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.447 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.448 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.449 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.450 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.451 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.452 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.453 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.454 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.455 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.456 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.457 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.458 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.459 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.460 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.461 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.462 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.463 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.464 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.465 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.466 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.467 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.468 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.469 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.470 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.471 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.472 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.473 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.474 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.475 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.476 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.477 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.478 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.479 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.480 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.481 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.482 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.483 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.484 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.485 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.486 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.487 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.488 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.489 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.489 226326 DEBUG oslo_service.service [None req-078ee1ae-88cb-47ce-8613-169943e0360b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.490 226326 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 26 05:01:22 np0005595445 podman[226436]: 2026-01-26 10:01:22.498921 +0000 UTC m=+0.066627552 container create 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.513 226326 INFO nova.virt.node [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Determined node identity d06842a0-5d13-4573-bb78-d433bbb380e4 from /var/lib/nova/compute_id#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.514 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.515 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.528 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7faaed35dee0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.531 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7faaed35dee0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.531 226326 INFO nova.virt.libvirt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.538 226326 INFO nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <host>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <uuid>0657a708-098a-4137-a4d8-8ea25323424c</uuid>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <arch>x86_64</arch>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model>EPYC-Rome-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <vendor>AMD</vendor>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <microcode version='16777317'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <signature family='23' model='49' stepping='0'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='x2apic'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='tsc-deadline'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='osxsave'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='hypervisor'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='tsc_adjust'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='spec-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='stibp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='arch-capabilities'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='cmp_legacy'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='topoext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='virt-ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='lbrv'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='tsc-scale'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='vmcb-clean'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='pause-filter'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='pfthreshold'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='svme-addr-chk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='rdctl-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='skip-l1dfl-vmentry'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='mds-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature name='pschange-mc-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <pages unit='KiB' size='4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <pages unit='KiB' size='2048'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <pages unit='KiB' size='1048576'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <power_management>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <suspend_mem/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </power_management>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <iommu support='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <migration_features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <live/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <uri_transports>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <uri_transport>tcp</uri_transport>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <uri_transport>rdma</uri_transport>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </uri_transports>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </migration_features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <topology>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <cells num='1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <cell id='0'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <memory unit='KiB'>7864304</memory>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <pages unit='KiB' size='2048'>0</pages>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <distances>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <sibling id='0' value='10'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          </distances>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          <cpus num='8'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:          </cpus>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        </cell>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </cells>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </topology>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <cache>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </cache>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <secmodel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model>selinux</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <doi>0</doi>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </secmodel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <secmodel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model>dac</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <doi>0</doi>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </secmodel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </host>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <guest>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <os_type>hvm</os_type>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <arch name='i686'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <wordsize>32</wordsize>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <domain type='qemu'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <domain type='kvm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </arch>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <pae/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <nonpae/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <acpi default='on' toggle='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <apic default='on' toggle='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <cpuselection/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <deviceboot/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <disksnapshot default='on' toggle='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <externalSnapshot/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </guest>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <guest>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <os_type>hvm</os_type>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <arch name='x86_64'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <wordsize>64</wordsize>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <domain type='qemu'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <domain type='kvm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </arch>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <acpi default='on' toggle='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <apic default='on' toggle='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <cpuselection/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <deviceboot/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <disksnapshot default='on' toggle='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <externalSnapshot/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </guest>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 
Jan 26 05:01:22 np0005595445 nova_compute[226322]: </capabilities>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: #033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.544 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.546 226326 DEBUG nova.virt.libvirt.volume.mount [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.549 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 05:01:22 np0005595445 nova_compute[226322]: <domainCapabilities>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <domain>kvm</domain>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <arch>i686</arch>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <vcpu max='4096'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <iothreads supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <os supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <enum name='firmware'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <loader supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>rom</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pflash</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='readonly'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>yes</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>no</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='secure'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>no</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </loader>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>on</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>off</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='maximumMigratable'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>on</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>off</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <vendor>AMD</vendor>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:22 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='succor'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:22 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:22 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:22 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='custom' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ddpd-u'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sha512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ddpd-u'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sha512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton'>
Jan 26 05:01:22 np0005595445 bash[226436]: 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbpb'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbpb'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 podman[226436]: 2026-01-26 10:01:22.471028135 +0000 UTC m=+0.038734717 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-128'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-256'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 podman[226436]: 2026-01-26 10:01:22.571898772 +0000 UTC m=+0.139605344 container init 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-128'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-256'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 podman[226436]: 2026-01-26 10:01:22.577202603 +0000 UTC m=+0.144909155 container start 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:01:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='KnightsMill'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512er'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512pf'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512er'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512pf'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tbm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tbm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='athlon'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='athlon-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='core2duo'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='core2duo-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='coreduo'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='coreduo-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='n270'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='n270-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='phenom'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='phenom-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <memoryBacking supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <enum name='sourceType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>file</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>anonymous</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>memfd</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </memoryBacking>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <disk supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='diskDevice'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>disk</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>cdrom</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>floppy</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>lun</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='bus'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>fdc</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>scsi</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>usb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>sata</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-non-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <graphics supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vnc</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>egl-headless</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>dbus</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </graphics>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <video supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='modelType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vga</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>cirrus</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>none</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>bochs</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>ramfb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <hostdev supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='mode'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>subsystem</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='startupPolicy'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>default</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>mandatory</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>requisite</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>optional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='subsysType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>usb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pci</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>scsi</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='capsType'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='pciBackend'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </hostdev>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <rng supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-non-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='backendModel'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>random</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>egd</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>builtin</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <filesystem supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='driverType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>path</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>handle</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtiofs</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </filesystem>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <tpm supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>tpm-tis</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>tpm-crb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='backendModel'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>emulator</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>external</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='backendVersion'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>2.0</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </tpm>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <redirdev supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='bus'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>usb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </redirdev>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <channel supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pty</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>unix</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </channel>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <crypto supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>qemu</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='backendModel'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>builtin</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </crypto>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <interface supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='backendType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>default</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>passt</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <panic supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>isa</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>hyperv</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </panic>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <console supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>null</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vc</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pty</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>dev</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>file</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pipe</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>stdio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>udp</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>tcp</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>unix</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>qemu-vdagent</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>dbus</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </console>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <gic supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <vmcoreinfo supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <genid supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <backingStoreInput supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <backup supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <async-teardown supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <s390-pv supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <ps2 supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <tdx supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <sev supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <sgx supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <hyperv supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='features'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>relaxed</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vapic</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>spinlocks</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vpindex</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>runtime</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>synic</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>stimer</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>reset</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vendor_id</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>frequencies</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>reenlightenment</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>tlbflush</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>ipi</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>avic</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>emsr_bitmap</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>xmm_input</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <defaults>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <spinlocks>4095</spinlocks>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <stimer_direct>on</stimer_direct>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <tlbflush_direct>on</tlbflush_direct>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <tlbflush_extended>on</tlbflush_extended>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </defaults>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </hyperv>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <launchSecurity supported='no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: </domainCapabilities>
Jan 26 05:01:22 np0005595445 nova_compute[226322]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 26 05:01:22 np0005595445 nova_compute[226322]: 2026-01-26 10:01:22.561 226326 DEBUG nova.virt.libvirt.host [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 05:01:22 np0005595445 nova_compute[226322]: <domainCapabilities>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <path>/usr/libexec/qemu-kvm</path>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <domain>kvm</domain>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <arch>i686</arch>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <vcpu max='240'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <iothreads supported='yes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <os supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <enum name='firmware'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <loader supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>rom</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>pflash</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='readonly'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>yes</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>no</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='secure'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>no</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </loader>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='host-passthrough' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='hostPassthroughMigratable'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>on</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>off</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='maximum' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='maximumMigratable'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>on</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>off</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='host-model' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <vendor>AMD</vendor>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='x2apic'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc-deadline'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='hypervisor'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc_adjust'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='spec-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='stibp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='cmp_legacy'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='overflow-recov'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='succor'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='amd-ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='virt-ssbd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='lbrv'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='tsc-scale'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='vmcb-clean'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='flushbyasid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='pause-filter'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='pfthreshold'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='svme-addr-chk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <feature policy='disable' name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <mode name='custom' supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Broadwell-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cascadelake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='ClearwaterForest'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ddpd-u'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sha512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='ClearwaterForest-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ddpd-u'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sha512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm3'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sm4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Cooperlake-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Denverton-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Dhyana-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Genoa-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Milan-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Rome-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Turin'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbpb'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-Turin-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amd-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='auto-ibrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vp2intersect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fs-gs-base-ns'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibpb-brtype'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='no-nested-data-bp'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='null-sel-clr-base'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='perfmon-v2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbpb'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='srso-user-kernel-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='stibp-always-on'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='EPYC-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-128'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-256'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='GraniteRapids-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-128'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-256'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx10-512'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='prefetchiti'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Haswell-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-noTSX'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v6'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Icelake-Server-v7'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='IvyBridge-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='KnightsMill'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512er'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512pf'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='KnightsMill-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4fmaps'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-4vnniw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512er'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512pf'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G4-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tbm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Opteron_G5-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fma4'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tbm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xop'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SapphireRapids-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='amx-tile'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-bf16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-fp16'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512-vpopcntdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bitalg'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vbmi2'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrc'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fzrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='la57'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='taa-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='tsx-ldtrk'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='SierraForest-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ifma'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-ne-convert'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx-vnni-int8'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bhi-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='bus-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cmpccxadd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fbsdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='fsrs'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ibrs-all'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='intel-psfd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ipred-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='lam'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mcdt-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pbrsb-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='psdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rrsba-ctrl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='sbdr-ssdp-no'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='serialize'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vaes'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='vpclmulqdq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Client-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='hle'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='rtm'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Skylake-Server-v5'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512bw'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512cd'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512dq'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512f'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='avx512vl'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='invpcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pcid'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='pku'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='mpx'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v2'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v3'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='core-capability'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='split-lock-detect'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='Snowridge-v4'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='cldemote'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='erms'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='gfni'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdir64b'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='movdiri'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='xsaves'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='athlon'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='athlon-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='core2duo'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='core2duo-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='coreduo'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='coreduo-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='n270'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='n270-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='ss'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='phenom'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <blockers model='phenom-v1'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnow'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <feature name='3dnowext'/>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </blockers>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </mode>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <memoryBacking supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <enum name='sourceType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>file</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>anonymous</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <value>memfd</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  </memoryBacking>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <disk supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='diskDevice'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>disk</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>cdrom</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>floppy</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>lun</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='bus'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>ide</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>fdc</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>scsi</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>usb</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>sata</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='model'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio-non-transitional</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <graphics supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='type'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vnc</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>egl-headless</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>dbus</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      </enum>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    </graphics>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:    <video supported='yes'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:      <enum name='modelType'>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>vga</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>cirrus</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>virtio</value>
Jan 26 05:01:22 np0005595445 nova_compute[226322]:        <value>none</value>
Jan 26 05:02:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:02:48 np0005595445 rsyslogd[1005]: imjournal: 4831 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 26 05:02:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:48 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:48.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:02:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:48.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:02:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 26 05:02:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3978845484' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 26 05:02:49 np0005595445 podman[227225]: 2026-01-26 10:02:49.301493202 +0000 UTC m=+0.081643194 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:02:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b380016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:49 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c002c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:50 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:50.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:02:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:50.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:02:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:51 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:52 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:02:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:52.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:02:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:52.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:02:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:53 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:02:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:02:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:02:53.925 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:02:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:54 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:54.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:02:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:54.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:02:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:55 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:02:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:02:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:02:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:02:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:56 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:56.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:57 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:02:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:58 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:02:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:02:58.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:02:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:02:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:02:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:02:58.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:02:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:02:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:02:59 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:00 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:03:00 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:03:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:00.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:00.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:01 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:02 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:02.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:02.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:03 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:04 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:04.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:04.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:05 np0005595445 podman[227395]: 2026-01-26 10:03:05.268426207 +0000 UTC m=+0.044943605 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 05:03:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:05 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:06 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:06.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:06.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:07 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:08 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b38003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:08.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:08.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:09 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:10 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:10 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:10.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:11 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:12 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:12.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:13 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:14 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:14.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:14 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:14.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:15 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:16 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:16.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:16.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c003960 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:17 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:18 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:18.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:18 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:18.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:19 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:20 np0005595445 podman[227424]: 2026-01-26 10:03:20.318759564 +0000 UTC m=+0.093348527 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 05:03:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:20 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:20.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:20 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:20.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:21 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:22 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:22 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:22.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:22.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.505 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.527 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.528 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002690 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.707 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:03:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:23 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.983 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.984 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:03:23 np0005595445 nova_compute[226322]: 2026-01-26 10:03:23.984 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:03:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:24 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:24 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:03:24 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1818591514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.455 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:03:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:24 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.607 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5245MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.609 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.673 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.673 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:03:24 np0005595445 nova_compute[226322]: 2026-01-26 10:03:24.691 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:03:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:03:25 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487621882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:03:25 np0005595445 nova_compute[226322]: 2026-01-26 10:03:25.164 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:03:25 np0005595445 nova_compute[226322]: 2026-01-26 10:03:25.169 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:03:25 np0005595445 nova_compute[226322]: 2026-01-26 10:03:25.189 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:03:25 np0005595445 nova_compute[226322]: 2026-01-26 10:03:25.191 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:03:25 np0005595445 nova_compute[226322]: 2026-01-26 10:03:25.191 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:03:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:25 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:26 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:26 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:27 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:28 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:28.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:28.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:29 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:30 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:30.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:30.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:31 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:32 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:32.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:32.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b5c004a60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:33 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b58004410 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:34 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b40004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:03:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:34.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:34.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[226472]: 26/01/2026 10:03:35 : epoch 69773b72 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0b30002e50 fd 47 proxy ignored for local
Jan 26 05:03:35 np0005595445 kernel: ganesha.nfsd[227420]: segfault at 50 ip 00007f0be774532e sp 00007f0b6effc210 error 4 in libntirpc.so.5.8[7f0be772a000+2c000] likely on CPU 1 (core 0, socket 1)
Jan 26 05:03:35 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 05:03:35 np0005595445 systemd[1]: Started Process Core Dump (PID 227529/UID 0).
Jan 26 05:03:35 np0005595445 podman[227530]: 2026-01-26 10:03:35.685440751 +0000 UTC m=+0.090814902 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 05:03:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:36 np0005595445 systemd-coredump[227531]: Process 226476 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 61:#012#0  0x00007f0be774532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 05:03:36 np0005595445 systemd[1]: systemd-coredump@8-227529-0.service: Deactivated successfully.
Jan 26 05:03:36 np0005595445 systemd[1]: systemd-coredump@8-227529-0.service: Consumed 1.262s CPU time.
Jan 26 05:03:36 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:03:36 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:03:37 np0005595445 podman[227555]: 2026-01-26 10:03:37.011449083 +0000 UTC m=+0.026584871 container died 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:03:37 np0005595445 systemd[1]: var-lib-containers-storage-overlay-93dfa11a05619004f732a88ff930df4934d43a692e19d27b8c03adb369fa54e2-merged.mount: Deactivated successfully.
Jan 26 05:03:37 np0005595445 podman[227555]: 2026-01-26 10:03:37.056555899 +0000 UTC m=+0.071691707 container remove 3a6ae4dc2df3af4a6bf4e06b720b3b075b63d61c4adfe2b2c9d563de7c3cae7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 26 05:03:37 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 05:03:37 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 05:03:37 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.780s CPU time.
Jan 26 05:03:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:38 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:40.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100341 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:03:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:03:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:42.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:42 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:42.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:44.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:44.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:46.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:46.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.126357) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827126427, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 251, "total_data_size": 6264099, "memory_usage": 6368336, "flush_reason": "Manual Compaction"}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827150421, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4098766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20744, "largest_seqno": 23102, "table_properties": {"data_size": 4089234, "index_size": 6026, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19523, "raw_average_key_size": 20, "raw_value_size": 4070294, "raw_average_value_size": 4213, "num_data_blocks": 264, "num_entries": 966, "num_filter_entries": 966, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421607, "oldest_key_time": 1769421607, "file_creation_time": 1769421827, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 24252 microseconds, and 9785 cpu microseconds.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.150604) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4098766 bytes OK
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.150651) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152793) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152830) EVENT_LOG_v1 {"time_micros": 1769421827152819, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.152890) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6253727, prev total WAL file size 6253727, number of live WAL files 2.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.156145) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4002KB)], [39(12MB)]
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827156267, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17167483, "oldest_snapshot_seqno": -1}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5417 keys, 14966389 bytes, temperature: kUnknown
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827244244, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14966389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14927647, "index_size": 24104, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 136564, "raw_average_key_size": 25, "raw_value_size": 14827180, "raw_average_value_size": 2737, "num_data_blocks": 997, "num_entries": 5417, "num_filter_entries": 5417, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421827, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.244512) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14966389 bytes
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.245674) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.9 rd, 169.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(7.8) write-amplify(3.7) OK, records in: 5937, records dropped: 520 output_compression: NoCompression
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.245811) EVENT_LOG_v1 {"time_micros": 1769421827245683, "job": 22, "event": "compaction_finished", "compaction_time_micros": 88067, "compaction_time_cpu_micros": 31373, "output_level": 6, "num_output_files": 1, "total_output_size": 14966389, "num_input_records": 5937, "num_output_records": 5417, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827246798, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421827249465, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.156013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249513) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:03:47.249519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:03:47 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 9.
Jan 26 05:03:47 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:03:47 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.780s CPU time.
Jan 26 05:03:47 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 05:03:47 np0005595445 podman[227683]: 2026-01-26 10:03:47.678473191 +0000 UTC m=+0.043417490 container create da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Jan 26 05:03:47 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 05:03:47 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:03:47 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:03:47 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:03:47 np0005595445 podman[227683]: 2026-01-26 10:03:47.742671333 +0000 UTC m=+0.107615632 container init da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:03:47 np0005595445 podman[227683]: 2026-01-26 10:03:47.749099781 +0000 UTC m=+0.114044050 container start da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 05:03:47 np0005595445 bash[227683]: da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f
Jan 26 05:03:47 np0005595445 podman[227683]: 2026-01-26 10:03:47.657719489 +0000 UTC m=+0.022663778 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:03:47 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 05:03:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:03:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:48.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:48.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:50.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:50.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:51 np0005595445 podman[227741]: 2026-01-26 10:03:51.325466736 +0000 UTC m=+0.105832551 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 05:03:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:52.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 05:03:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:52.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 05:03:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:03:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:03:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:03:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.926 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:03:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:03:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:03:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:03:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:54.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:54.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100354 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:03:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:03:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:56.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:03:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:56.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:03:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:03:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:03:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:03:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:03:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:03:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:03:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:03:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:03:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:03:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:03:58.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:04:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:04:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:00.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:00.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:04:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:04:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:04:01 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:04:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 05:04:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:02.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 05:04:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:02.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:03 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.259 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:04:03 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.262 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:04:03 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:03.263 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:04:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:04.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:04.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:06 np0005595445 podman[227883]: 2026-01-26 10:04:06.317530882 +0000 UTC m=+0.083316611 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f034c000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 05:04:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:04:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:06.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:06.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:08 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:08.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:08.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100409 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:04:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:10 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03280016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:10.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:10.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100411 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:04:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:04:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:12 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:12.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:12.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:12 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:04:12 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:04:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 05:04:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:14.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 05:04:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:14.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:04:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:15 np0005595445 ceph-mon[80107]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 26 05:04:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:16 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:04:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:16.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:04:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:16.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:18 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:04:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:18.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:04:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:18.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:04:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:04:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:04:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:20.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:21 np0005595445 podman[227974]: 2026-01-26 10:04:21.68600953 +0000 UTC m=+0.071419848 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 05:04:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:22 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:04:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:22.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:04:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:22.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:04:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.170 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.171 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.192 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.192 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.193 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:04:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:24 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:24 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:04:24 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613830193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:04:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:24.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:24.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.647 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.803 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.806 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5235MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.806 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.807 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.870 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.870 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:04:24 np0005595445 nova_compute[226322]: 2026-01-26 10:04:24.884 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:04:25 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:04:25 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/456321413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.348 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.354 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.376 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.378 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.379 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:04:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340003ca0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.889 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.889 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.890 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.890 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.946 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.947 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:04:25 np0005595445 nova_compute[226322]: 2026-01-26 10:04:25.948 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:04:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 05:04:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:26.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 05:04:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:26.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100426 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:04:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:04:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:28 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:04:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:28.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:04:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:28.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:30 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 26 05:04:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:30.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 26 05:04:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:30.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100431 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:04:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:32 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:32.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 26 05:04:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:32.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 26 05:04:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:34 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:34.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:34.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f033c0022e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:36 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:36.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:36.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:37 np0005595445 podman[228059]: 2026-01-26 10:04:37.281616817 +0000 UTC m=+0.060102630 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 05:04:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:38.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:38.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:40 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:40.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:42 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:04:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:42.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:04:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:44.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:44.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:46 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:46.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:46.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:48 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:48.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:50 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:04:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:50.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:04:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:50.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:52 np0005595445 podman[228116]: 2026-01-26 10:04:52.376772619 +0000 UTC m=+0.153150873 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:04:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:52 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:52.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:52.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:04:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:04:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:04:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:04:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:54 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:54.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:54.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:56 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:56.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:56.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.083174) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898083245, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 906, "num_deletes": 250, "total_data_size": 1850403, "memory_usage": 1869800, "flush_reason": "Manual Compaction"}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898188522, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 793278, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23107, "largest_seqno": 24008, "table_properties": {"data_size": 789864, "index_size": 1194, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9210, "raw_average_key_size": 20, "raw_value_size": 782456, "raw_average_value_size": 1719, "num_data_blocks": 52, "num_entries": 455, "num_filter_entries": 455, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421828, "oldest_key_time": 1769421828, "file_creation_time": 1769421898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 105434 microseconds, and 4061 cpu microseconds.
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.188612) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 793278 bytes OK
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.188642) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266715) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266778) EVENT_LOG_v1 {"time_micros": 1769421898266766, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.266840) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1845815, prev total WAL file size 1847663, number of live WAL files 2.
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.267852) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(774KB)], [42(14MB)]
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898267979, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15759667, "oldest_snapshot_seqno": -1}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5383 keys, 12101845 bytes, temperature: kUnknown
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898337048, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12101845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12067130, "index_size": 20141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 136238, "raw_average_key_size": 25, "raw_value_size": 11971042, "raw_average_value_size": 2223, "num_data_blocks": 823, "num_entries": 5383, "num_filter_entries": 5383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.337292) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12101845 bytes
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.338565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 227.9 rd, 175.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.3 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(35.1) write-amplify(15.3) OK, records in: 5872, records dropped: 489 output_compression: NoCompression
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.338586) EVENT_LOG_v1 {"time_micros": 1769421898338577, "job": 24, "event": "compaction_finished", "compaction_time_micros": 69142, "compaction_time_cpu_micros": 29070, "output_level": 6, "num_output_files": 1, "total_output_size": 12101845, "num_input_records": 5872, "num_output_records": 5383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898338888, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421898341864, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.267677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:04:58.341999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:04:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:04:58.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:04:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:04:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:04:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:04:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:04:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:04:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:00.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:00.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:02 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c001920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:02.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:02.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0340004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:04 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:04.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 05:05:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:04.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 05:05:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:06.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:06.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:08 np0005595445 podman[228181]: 2026-01-26 10:05:08.269587234 +0000 UTC m=+0.052014781 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:05:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:08 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:08.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:08.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:09 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:10 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:10.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:10.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:11 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:11 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:12 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:12 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:12.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 05:05:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:12.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 05:05:13 np0005595445 podman[228323]: 2026-01-26 10:05:13.088309085 +0000 UTC m=+0.072197436 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True)
Jan 26 05:05:13 np0005595445 podman[228323]: 2026-01-26 10:05:13.219067429 +0000 UTC m=+0.202955770 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 05:05:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 05:05:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:13 np0005595445 podman[228465]: 2026-01-26 10:05:13.859281937 +0000 UTC m=+0.085819528 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:05:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:13 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:13 np0005595445 podman[228488]: 2026-01-26 10:05:13.926929953 +0000 UTC m=+0.052805845 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:05:14 np0005595445 podman[228465]: 2026-01-26 10:05:14.009456601 +0000 UTC m=+0.235994172 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:05:14 np0005595445 podman[228535]: 2026-01-26 10:05:14.207876212 +0000 UTC m=+0.046913380 container exec da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 05:05:14 np0005595445 podman[228535]: 2026-01-26 10:05:14.220175497 +0000 UTC m=+0.059212645 container exec_died da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:05:14 np0005595445 podman[228601]: 2026-01-26 10:05:14.453462732 +0000 UTC m=+0.099025380 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:05:14 np0005595445 podman[228601]: 2026-01-26 10:05:14.467368897 +0000 UTC m=+0.112931545 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:05:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:14 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:14 np0005595445 podman[228666]: 2026-01-26 10:05:14.66441029 +0000 UTC m=+0.054485947 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, version=2.2.4, distribution-scope=public, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Jan 26 05:05:14 np0005595445 podman[228666]: 2026-01-26 10:05:14.702064321 +0000 UTC m=+0.092139978 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, version=2.2.4, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, name=keepalived, description=keepalived for Ceph)
Jan 26 05:05:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:14.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 05:05:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:15 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:16 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:16.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 05:05:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:05:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:16 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:05:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:17 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:17 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:18 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:18.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c0036d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:19 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:20 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000005s ======
Jan 26 05:05:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:20.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000005s
Jan 26 05:05:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:20.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:21 np0005595445 nova_compute[226322]: 2026-01-26 10:05:21.741 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:05:21 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:21 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:22 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:22.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000006s ======
Jan 26 05:05:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:22.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000006s
Jan 26 05:05:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:23 np0005595445 podman[228834]: 2026-01-26 10:05:23.348687917 +0000 UTC m=+0.127335415 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:05:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:23 np0005595445 nova_compute[226322]: 2026-01-26 10:05:23.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:23 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:24 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:24 np0005595445 nova_compute[226322]: 2026-01-26 10:05:24.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:24 np0005595445 nova_compute[226322]: 2026-01-26 10:05:24.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:24.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.714 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:05:25 np0005595445 nova_compute[226322]: 2026-01-26 10:05:25.715 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:05:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:25 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:05:26 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1779943853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.161 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.346 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.348 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5200MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.348 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.349 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.434 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.435 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.451 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:05:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:26 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100526 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:05:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:26.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:05:26 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1283743844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.905 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.909 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.926 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.928 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:05:26 np0005595445 nova_compute[226322]: 2026-01-26 10:05:26.928 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:05:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:27 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:27 np0005595445 nova_compute[226322]: 2026-01-26 10:05:27.928 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:27 np0005595445 nova_compute[226322]: 2026-01-26 10:05:27.929 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:05:27 np0005595445 nova_compute[226322]: 2026-01-26 10:05:27.929 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:05:27 np0005595445 nova_compute[226322]: 2026-01-26 10:05:27.950 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:05:27 np0005595445 nova_compute[226322]: 2026-01-26 10:05:27.951 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:05:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:28 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:28.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:28.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03400046b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:29 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:30 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:30.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 05:05:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:30.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 05:05:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100531 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:05:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:31 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:32 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:05:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:32.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:05:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:33 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:34 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 05:05:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:34.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 05:05:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 05:05:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:34.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 05:05:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:05:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:35 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:36 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004440 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:36.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:36.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000062s ======
Jan 26 05:05:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - - [26/Jan/2026:10:05:36.980 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.002000062s
Jan 26 05:05:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:37 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:05:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:05:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:38 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:38.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 26 05:05:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:38.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 26 05:05:39 np0005595445 podman[228917]: 2026-01-26 10:05:39.307146172 +0000 UTC m=+0.082307901 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:05:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004460 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:39 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:40 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:40.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000058s ======
Jan 26 05:05:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000058s
Jan 26 05:05:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:05:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:05:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:41 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:42 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:42.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:42.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 26 05:05:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:43 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004480 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 26 05:05:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:05:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:05:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:05:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:44 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:05:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:44.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:44.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 26 05:05:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:45 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:46 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100546 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:05:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:46.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:46.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 26 05:05:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:05:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0344004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:47 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 26 05:05:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:48 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:48.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:48.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:49 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:50 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:50.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:50.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:51 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03500044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:52 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:52.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:52.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100553 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:05:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:53 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.927 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:05:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:05:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:05:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:05:54 np0005595445 podman[228971]: 2026-01-26 10:05:54.330616833 +0000 UTC m=+0.110988345 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:05:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:54 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004500 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:54.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:54.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:55 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:56 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:56.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:56.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:57 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:05:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:05:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:05:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3630123591' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:05:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:05:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:58 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:05:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:05:58.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:05:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:05:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:05:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:05:58.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:05:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:05:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:05:59 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0350004540 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:00 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:00 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:00.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:00.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:01 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:01 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:02 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:02 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:02.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:02.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f032c002930 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:03 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:04 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:06:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:06:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:06:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:04.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:06:05 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:05.265 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:06:05 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:05.266 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:06:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f03440041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:05 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:05 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0328004470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:06 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:06 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:06.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:06.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:07 np0005595445 kernel: ganesha.nfsd[228145]: segfault at 50 ip 00007f03d675032e sp 00007f03367fb210 error 4 in libntirpc.so.5.8[7f03d6735000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 05:06:07 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 05:06:07 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[227698]: 26/01/2026 10:06:07 : epoch 69773c03 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0320003c10 fd 38 proxy ignored for local
Jan 26 05:06:07 np0005595445 systemd[1]: Started Process Core Dump (PID 229036/UID 0).
Jan 26 05:06:08 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:08.268 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.348180) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968348402, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1058, "num_deletes": 255, "total_data_size": 2375899, "memory_usage": 2407856, "flush_reason": "Manual Compaction"}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968381347, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1564177, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24013, "largest_seqno": 25066, "table_properties": {"data_size": 1559303, "index_size": 2398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10405, "raw_average_key_size": 19, "raw_value_size": 1549381, "raw_average_value_size": 2853, "num_data_blocks": 105, "num_entries": 543, "num_filter_entries": 543, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421898, "oldest_key_time": 1769421898, "file_creation_time": 1769421968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 33196 microseconds, and 10369 cpu microseconds.
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.381389) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1564177 bytes OK
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.381408) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382807) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382822) EVENT_LOG_v1 {"time_micros": 1769421968382819, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.382838) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2370650, prev total WAL file size 2370650, number of live WAL files 2.
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.383534) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353032' seq:0, type:0; will stop at (end)
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1527KB)], [45(11MB)]
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968383584, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13666022, "oldest_snapshot_seqno": -1}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5394 keys, 13454390 bytes, temperature: kUnknown
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968456251, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13454390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13418066, "index_size": 21750, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 137634, "raw_average_key_size": 25, "raw_value_size": 13320220, "raw_average_value_size": 2469, "num_data_blocks": 888, "num_entries": 5394, "num_filter_entries": 5394, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769421968, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.456607) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13454390 bytes
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.463439) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.6 rd, 184.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.5 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(17.3) write-amplify(8.6) OK, records in: 5926, records dropped: 532 output_compression: NoCompression
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.463470) EVENT_LOG_v1 {"time_micros": 1769421968463456, "job": 26, "event": "compaction_finished", "compaction_time_micros": 72828, "compaction_time_cpu_micros": 24239, "output_level": 6, "num_output_files": 1, "total_output_size": 13454390, "num_input_records": 5926, "num_output_records": 5394, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968464155, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769421968468953, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.383459) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:06:08.469022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:06:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:08.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:08.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:08 np0005595445 systemd-coredump[229037]: Process 227702 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007f03d675032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012#1  0x0000000000000000 n/a (n/a + 0x0)#012#2  0x00007f03d675a900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)#012ELF object binary architecture: AMD x86-64
Jan 26 05:06:08 np0005595445 systemd[1]: systemd-coredump@9-229036-0.service: Deactivated successfully.
Jan 26 05:06:08 np0005595445 systemd[1]: systemd-coredump@9-229036-0.service: Consumed 1.214s CPU time.
Jan 26 05:06:08 np0005595445 podman[229042]: 2026-01-26 10:06:08.990773162 +0000 UTC m=+0.028927394 container died da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 26 05:06:09 np0005595445 systemd[1]: var-lib-containers-storage-overlay-388b2e91ee2994701e041fb7b4f25f4a9af1107644f6559cc4d916cf9618d53c-merged.mount: Deactivated successfully.
Jan 26 05:06:09 np0005595445 podman[229042]: 2026-01-26 10:06:09.145766336 +0000 UTC m=+0.183920548 container remove da36348c21b1c64499d70f03079a8c10f8292749eea8acac8020d0941757a01f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 26 05:06:09 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 05:06:09 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 05:06:09 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.745s CPU time.
Jan 26 05:06:10 np0005595445 podman[229087]: 2026-01-26 10:06:10.271429589 +0000 UTC m=+0.050698948 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 26 05:06:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:10.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:10.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:12.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:12.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100613 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:06:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:14.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:14.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:16.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:16.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:18.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:18.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:19 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 10.
Jan 26 05:06:19 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:06:19 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.745s CPU time.
Jan 26 05:06:19 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 05:06:19 np0005595445 podman[229160]: 2026-01-26 10:06:19.61563392 +0000 UTC m=+0.020793704 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:06:19 np0005595445 podman[229160]: 2026-01-26 10:06:19.776106385 +0000 UTC m=+0.181266129 container create 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 05:06:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 05:06:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:06:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:06:19 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:06:20 np0005595445 podman[229160]: 2026-01-26 10:06:20.22839519 +0000 UTC m=+0.633554974 container init 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:06:20 np0005595445 podman[229160]: 2026-01-26 10:06:20.233725377 +0000 UTC m=+0.638885131 container start 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 05:06:20 np0005595445 bash[229160]: 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca
Jan 26 05:06:20 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 05:06:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:20 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:06:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:20.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:20.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.714 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.715 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.715 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 05:06:21 np0005595445 nova_compute[226322]: 2026-01-26 10:06:21.737 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:06:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:06:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:06:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:06:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:22.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:22.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:24.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:24.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:25 np0005595445 podman[229327]: 2026-01-26 10:06:25.308504951 +0000 UTC m=+0.085670328 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.757 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:25 np0005595445 nova_compute[226322]: 2026-01-26 10:06:25.758 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:06:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:26 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:06:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:26 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.707 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.739 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:06:26 np0005595445 nova_compute[226322]: 2026-01-26 10:06:26.740 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:26.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:06:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:26.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:06:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:06:27 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1480749292' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.237 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.394 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.395 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5227MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.395 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.396 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.595 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.595 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.650 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.671 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.671 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.685 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.711 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:06:27 np0005595445 nova_compute[226322]: 2026-01-26 10:06:27.769 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:06:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:06:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:06:28 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4136502083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:06:28 np0005595445 nova_compute[226322]: 2026-01-26 10:06:28.258 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:28 np0005595445 nova_compute[226322]: 2026-01-26 10:06:28.265 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:06:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:28 np0005595445 nova_compute[226322]: 2026-01-26 10:06:28.334 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:06:28 np0005595445 nova_compute[226322]: 2026-01-26 10:06:28.335 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:06:28 np0005595445 nova_compute[226322]: 2026-01-26 10:06:28.336 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:28.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:28.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:29 np0005595445 nova_compute[226322]: 2026-01-26 10:06:29.316 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:06:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:30.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:30.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.322 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.323 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.340 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.433 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.434 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.439 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.439 226326 INFO nova.compute.claims [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Claim successful on node compute-1.ctlplane.example.com
Jan 26 05:06:31 np0005595445 nova_compute[226322]: 2026-01-26 10:06:31.554 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:06:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:06:32 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3240052794' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.027 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.034 226326 DEBUG nova.compute.provider_tree [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.059 226326 DEBUG nova.scheduler.client.report [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
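An aside on the inventory data in the line above: Placement derives the schedulable capacity of each resource class as (total - reserved) * allocation_ratio. A minimal sketch, using the exact values from this log; the helper name `effective_capacity` is our own, not a nova/placement API:

```python
# Inventory dict copied from the nova.scheduler.client.report log line above
# (min_unit/max_unit/step_size omitted; they do not affect capacity).
inventory = {
    'VCPU': {'total': 8, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 59, 'reserved': 0, 'allocation_ratio': 0.9},
}

def effective_capacity(inv):
    """Capacity per resource class: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(effective_capacity(inventory))
# VCPU: (8 - 0) * 4.0 = 32.0 schedulable vCPUs on this 8-core host;
# MEMORY_MB: (7679 - 512) * 1.0 = 7167.0; DISK_GB: (59 - 0) * 0.9 = 53.1
```

This is why an 8-vCPU host can accept claims for up to 32 vCPUs: the 4.0 overcommit ratio is applied on top of the physical total, while the 512 MB memory reservation is subtracted before its 1.0 ratio is applied.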
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.173 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.174 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.339 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.340 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.369 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.392 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.472 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.473 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.474 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating image(s)
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.502 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.541 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.572 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.575 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.576 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:06:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:32.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:32.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.878 226326 WARNING oslo_policy.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.879 226326 WARNING oslo_policy.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 26 05:06:32 np0005595445 nova_compute[226322]: 2026-01-26 10:06:32.881 226326 DEBUG nova.policy [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 05:06:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:32 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:06:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:33 np0005595445 nova_compute[226322]: 2026-01-26 10:06:33.303 226326 DEBUG nova.virt.libvirt.imagebackend [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image locations are: [{'url': 'rbd://1a70b85d-e3fd-5814-8a6a-37ea00fcae30/images/6789692f-fc1f-4efa-ae75-dcc13be695ef/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://1a70b85d-e3fd-5814-8a6a-37ea00fcae30/images/6789692f-fc1f-4efa-ae75-dcc13be695ef/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 26 05:06:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:33 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb314000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:33 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.171 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Successfully created port: 2a711799-8550-4432-83d1-93c9598eaa25 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.279 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.345 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.346 226326 DEBUG nova.virt.images [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] 6789692f-fc1f-4efa-ae75-dcc13be695ef was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.347 226326 DEBUG nova.privsep.utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.348 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.541 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.part /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.545 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:06:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:34 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.601 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.602 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.629 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:06:34 np0005595445 nova_compute[226322]: 2026-01-26 10:06:34.632 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:06:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:34.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:34.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 26 05:06:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100635 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:06:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:35 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:35 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb318001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:35 np0005595445 nova_compute[226322]: 2026-01-26 10:06:35.994 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Successfully updated port: 2a711799-8550-4432-83d1-93c9598eaa25 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.011 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.012 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.012 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.293 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 05:06:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:36 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:36.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.909 226326 DEBUG nova.compute.manager [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.910 226326 DEBUG nova.compute.manager [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:06:36 np0005595445 nova_compute[226322]: 2026-01-26 10:06:36.910 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:06:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 26 05:06:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:37 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:37 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.160 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.249 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 05:06:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:38 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.725 226326 DEBUG nova.objects.instance [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.743 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.743 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Ensure instance console log exists: /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:38 np0005595445 nova_compute[226322]: 2026-01-26 10:06:38.744 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:38.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:06:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:06:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:39 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.768 226326 DEBUG nova.network.neutron [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.787 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance network_info: |[{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.788 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.791 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start _get_guest_xml network_info=[{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.796 226326 WARNING nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.801 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.802 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.809 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.809 226326 DEBUG nova.virt.libvirt.host [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.810 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.811 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.812 226326 DEBUG nova.virt.hardware [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.815 226326 DEBUG nova.privsep.utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 26 05:06:39 np0005595445 nova_compute[226322]: 2026-01-26 10:06:39.816 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:39 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:06:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4246575752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.307 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.327 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.331 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:40 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:06:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3882837298' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.785 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.787 226326 DEBUG nova.virt.libvirt.vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:06:32Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.788 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.789 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.792 226326 DEBUG nova.objects.instance [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.816 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] End _get_guest_xml xml=<domain type="kvm">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <uuid>e89171d5-00d1-406a-bcc8-340bfdacbcbc</uuid>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <name>instance-00000001</name>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <memory>131072</memory>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <vcpu>1</vcpu>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <metadata>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:name>tempest-TestNetworkBasicOps-server-1457730655</nova:name>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:creationTime>2026-01-26 10:06:39</nova:creationTime>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:flavor name="m1.nano">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:memory>128</nova:memory>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:disk>1</nova:disk>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:swap>0</nova:swap>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:vcpus>1</nova:vcpus>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </nova:flavor>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:owner>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </nova:owner>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <nova:ports>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <nova:port uuid="2a711799-8550-4432-83d1-93c9598eaa25">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        </nova:port>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </nova:ports>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </nova:instance>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </metadata>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <sysinfo type="smbios">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <system>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="manufacturer">RDO</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="product">OpenStack Compute</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="serial">e89171d5-00d1-406a-bcc8-340bfdacbcbc</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="uuid">e89171d5-00d1-406a-bcc8-340bfdacbcbc</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <entry name="family">Virtual Machine</entry>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </system>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </sysinfo>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <os>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <boot dev="hd"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <smbios mode="sysinfo"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <acpi/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <apic/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <vmcoreinfo/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <clock offset="utc">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <timer name="hpet" present="no"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </clock>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <cpu mode="host-model" match="exact">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <disk type="network" device="disk">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <target dev="vda" bus="virtio"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <disk type="network" device="cdrom">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <target dev="sda" bus="sata"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <interface type="ethernet">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <mac address="fa:16:3e:30:cd:f5"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <mtu size="1442"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <target dev="tap2a711799-85"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <serial type="pty">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <log file="/var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/console.log" append="off"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </serial>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <video>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <input type="tablet" bus="usb"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <rng model="virtio">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <backend model="random">/dev/urandom</backend>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <controller type="usb" index="0"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    <memballoon model="virtio">
Jan 26 05:06:40 np0005595445 nova_compute[226322]:      <stats period="10"/>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:    </memballoon>
Jan 26 05:06:40 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:06:40 np0005595445 nova_compute[226322]: </domain>
Jan 26 05:06:40 np0005595445 nova_compute[226322]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.818 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Preparing to wait for external event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.819 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.820 226326 DEBUG nova.virt.libvirt.vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:06:32Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.820 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.821 226326 DEBUG nova.network.os_vif_util [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.821 226326 DEBUG os_vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 05:06:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:40.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:40.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.866 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.867 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.867 226326 DEBUG ovsdbapp.backend.ovs_idl [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.869 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.886 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:06:40 np0005595445 nova_compute[226322]: 2026-01-26 10:06:40.887 226326 INFO oslo.privsep.daemon [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpjhe3kfhi/privsep.sock']#033[00m
Jan 26 05:06:41 np0005595445 podman[229714]: 2026-01-26 10:06:41.310459129 +0000 UTC m=+0.083383460 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.577 226326 INFO oslo.privsep.daemon [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.455 229734 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.459 229734 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.461 229734 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.461 229734 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229734#033[00m
Jan 26 05:06:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:41 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.908 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.909 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a711799-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.909 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a711799-85, col_values=(('external_ids', {'iface-id': '2a711799-8550-4432-83d1-93c9598eaa25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:cd:f5', 'vm-uuid': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:41 np0005595445 NetworkManager[49073]: <info>  [1769422001.9130] manager: (tap2a711799-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.913 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.921 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.922 226326 INFO os_vif [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85')#033[00m
Jan 26 05:06:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:41 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.981 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:30:cd:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 05:06:41 np0005595445 nova_compute[226322]: 2026-01-26 10:06:41.982 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Using config drive#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.007 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.317 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.318 226326 DEBUG nova.network.neutron [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.336 226326 DEBUG oslo_concurrency.lockutils [req-56d9524d-3235-4684-af56-0b64b5c454a4 req-ae65cc06-2009-4386-9777-60979fa86b37 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.494 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Creating config drive at /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.498 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp16dhfavv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:42 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.628 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp16dhfavv" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.659 226326 DEBUG nova.storage.rbd_utils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.663 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.792 226326 DEBUG oslo_concurrency.processutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config e89171d5-00d1-406a-bcc8-340bfdacbcbc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.793 226326 INFO nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deleting local config drive /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc/disk.config because it was imported into RBD.#033[00m
Jan 26 05:06:42 np0005595445 systemd[1]: Starting libvirt secret daemon...
Jan 26 05:06:42 np0005595445 systemd[1]: Started libvirt secret daemon.
Jan 26 05:06:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:42.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:06:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:06:42 np0005595445 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 05:06:42 np0005595445 kernel: tap2a711799-85: entered promiscuous mode
Jan 26 05:06:42 np0005595445 NetworkManager[49073]: <info>  [1769422002.8971] manager: (tap2a711799-85): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 05:06:42 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:42Z|00027|binding|INFO|Claiming lport 2a711799-8550-4432-83d1-93c9598eaa25 for this chassis.
Jan 26 05:06:42 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:42Z|00028|binding|INFO|2a711799-8550-4432-83d1-93c9598eaa25: Claiming fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.899 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.903 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:42 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.923 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:cd:f5 10.100.0.8'], port_security=['fa:16:3e:30:cd:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9f16b9a-82ea-4f86-8e79-c292f5efcb52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aeef230-e621-407c-aaae-8f0a628ce92a, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=2a711799-8550-4432-83d1-93c9598eaa25) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:06:42 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.924 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 2a711799-8550-4432-83d1-93c9598eaa25 in datapath 7939c8f9-060e-41e0-9e41-ebecf5f62dfb bound to our chassis#033[00m
Jan 26 05:06:42 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.926 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7939c8f9-060e-41e0-9e41-ebecf5f62dfb#033[00m
Jan 26 05:06:42 np0005595445 systemd-udevd[229860]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:06:42 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:42.928 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpgkojd4t6/privsep.sock']#033[00m
Jan 26 05:06:42 np0005595445 systemd-machined[194876]: New machine qemu-1-instance-00000001.
Jan 26 05:06:42 np0005595445 NetworkManager[49073]: <info>  [1769422002.9602] device (tap2a711799-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 05:06:42 np0005595445 NetworkManager[49073]: <info>  [1769422002.9610] device (tap2a711799-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 05:06:42 np0005595445 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.982 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:42 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:42Z|00029|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 ovn-installed in OVS
Jan 26 05:06:42 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:42Z|00030|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 up in Southbound
Jan 26 05:06:42 np0005595445 nova_compute[226322]: 2026-01-26 10:06:42.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 26 05:06:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.601 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.6012678, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.603 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Started (Lifecycle Event)#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.621 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.621 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgkojd4t6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.492 229912 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.496 229912 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.498 229912 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.498 229912 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229912#033[00m
Jan 26 05:06:43 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:43.624 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[de62acc8-a100-475c-9a1e-cad0993d7715]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.636 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.640 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.6014047, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.640 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Paused (Lifecycle Event)#033[00m
Jan 26 05:06:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:43 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.655 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.658 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.669 226326 DEBUG nova.compute.manager [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG oslo_concurrency.lockutils [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.670 226326 DEBUG nova.compute.manager [req-b9b40dd8-e449-4f14-aea2-1bbc655792ab req-6d5fd51a-6c7a-44c1-849b-9b3133aa086f b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Processing event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.671 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.676 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.683 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.684 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422003.68418, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.684 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Resumed (Lifecycle Event)#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.688 226326 INFO nova.virt.libvirt.driver [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance spawned successfully.#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.688 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.706 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.710 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.732 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.739 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.740 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.740 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.741 226326 DEBUG nova.virt.libvirt.driver [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.788 226326 INFO nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 11.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.789 226326 DEBUG nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.859 226326 INFO nova.compute.manager [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 12.45 seconds to build instance.#033[00m
Jan 26 05:06:43 np0005595445 nova_compute[226322]: 2026-01-26 10:06:43.876 226326 DEBUG oslo_concurrency.lockutils [None req-fdbf126d-6568-4e8b-abdc-4acb8bceaca9 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:43 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:44 np0005595445 nova_compute[226322]: 2026-01-26 10:06:44.118 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.179 229912 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.180 229912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.180 229912 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:44 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.820 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf68adb-7f57-4cfc-b08e-eb481667f189]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.821 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7939c8f9-01 in ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.822 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7939c8f9-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.823 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[45167cc3-7ace-4a58-9942-18db37853d3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.825 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd11499a-2f89-498c-8eda-303f3c41bb94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:44.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.855 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[8d897bd8-ed4e-4593-9d89-014b494662e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.875 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ac74a3b3-a603-486a-b537-551e4786d458]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:44 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:44.877 143326 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpe_q9dabc/privsep.sock']#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.596 143326 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.597 143326 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpe_q9dabc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.475 229936 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.482 229936 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.485 229936 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.486 229936 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229936#033[00m
Jan 26 05:06:45 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:45.600 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[a8116edf-623e-4d9b-ab63-1d54fa580f66]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:45 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.848 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 DEBUG oslo_concurrency.lockutils [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 DEBUG nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.849 226326 WARNING nova.compute.manager [req-9313bba1-32d9-41e4-970f-3d059f719f73 req-78b82fb4-0509-4f88-84e5-469c25415b92 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received unexpected event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with vm_state active and task_state None.#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.894 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8951] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8960] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <warn>  [1769422005.8961] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8972] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8978] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <warn>  [1769422005.8978] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8989] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.8998] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.9003] device (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 05:06:45 np0005595445 NetworkManager[49073]: <info>  [1769422005.9009] device (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.929 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:45 np0005595445 nova_compute[226322]: 2026-01-26 10:06:45.933 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:45 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.184 229936 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:46 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.783 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0dda9a-a39d-49ce-a46b-cfcb9ee67f54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 NetworkManager[49073]: <info>  [1769422006.8067] manager: (tap7939c8f9-00): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.807 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[5c977bfe-dd9e-43c3-8015-9486fb2a6263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 systemd-udevd[229950]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.836 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c57640-3e17-4625-a24c-b497b1f4d87d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.839 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4235c3-33c7-42e2-b36a-6fbb57bd5614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:46.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:46 np0005595445 NetworkManager[49073]: <info>  [1769422006.8630] device (tap7939c8f9-00): carrier: link connected
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.866 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[435e14ff-4a25-48ca-809e-ba4741724ce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.885 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[87e88ad4-3c08-4e9b-ba15-76e6e525ff26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7939c8f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:20:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396742, 'reachable_time': 15416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229968, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.897 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[17130bb0-c757-47dc-ac66-387d30980e07]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:202b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396742, 'tstamp': 396742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229969, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.910 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0fd2b8-085f-43af-9b18-716c0a59a7ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7939c8f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:20:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396742, 'reachable_time': 15416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229970, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 nova_compute[226322]: 2026-01-26 10:06:46.912 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.938 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[57d4e07b-64ed-4309-a48f-511086149fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.990 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5b754e-3f6e-41d8-9345-f922754f7ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.991 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7939c8f9-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.992 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.992 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7939c8f9-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:46 np0005595445 kernel: tap7939c8f9-00: entered promiscuous mode
Jan 26 05:06:46 np0005595445 nova_compute[226322]: 2026-01-26 10:06:46.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:46 np0005595445 nova_compute[226322]: 2026-01-26 10:06:46.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:46 np0005595445 NetworkManager[49073]: <info>  [1769422006.9968] manager: (tap7939c8f9-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 05:06:46 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:46.997 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7939c8f9-00, col_values=(('external_ids', {'iface-id': '87777a78-2fc8-466a-8f48-d1f2b973e0a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:06:46 np0005595445 nova_compute[226322]: 2026-01-26 10:06:46.998 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:46 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:46Z|00031|binding|INFO|Releasing lport 87777a78-2fc8-466a-8f48-d1f2b973e0a9 from this chassis (sb_readonly=0)
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.011 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.012 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.013 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[89f19bcc-da54-4725-ad61-2f7b8a84206b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.016 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: global
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    log         /dev/log local0 debug
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    log-tag     haproxy-metadata-proxy-7939c8f9-060e-41e0-9e41-ebecf5f62dfb
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    user        root
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    group       root
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    maxconn     1024
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    pidfile     /var/lib/neutron/external/pids/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.pid.haproxy
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    daemon
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: defaults
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    log global
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    mode http
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    option httplog
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    option dontlognull
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    option http-server-close
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    option forwardfor
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    retries                 3
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    timeout http-request    30s
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    timeout connect         30s
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    timeout client          32s
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    timeout server          32s
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    timeout http-keep-alive 30s
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: listen listener
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    bind 169.254.169.254:80
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]:    http-request add-header X-OVN-Network-ID 7939c8f9-060e-41e0-9e41-ebecf5f62dfb
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 05:06:47 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:47.017 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'env', 'PROCESS_TAG=haproxy-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7939c8f9-060e-41e0-9e41-ebecf5f62dfb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 05:06:47 np0005595445 podman[230003]: 2026-01-26 10:06:47.400883647 +0000 UTC m=+0.048158535 container create 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 26 05:06:47 np0005595445 systemd[1]: Started libpod-conmon-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope.
Jan 26 05:06:47 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:06:47 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6308204caa6ef9ef44a497d3a0e6f90a3cb0448655da17487cc7918118dc819d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 05:06:47 np0005595445 podman[230003]: 2026-01-26 10:06:47.376278111 +0000 UTC m=+0.023553019 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 05:06:47 np0005595445 podman[230003]: 2026-01-26 10:06:47.485231183 +0000 UTC m=+0.132506071 container init 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 05:06:47 np0005595445 podman[230003]: 2026-01-26 10:06:47.491938246 +0000 UTC m=+0.139213124 container start 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:06:47 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : New worker (230026) forked
Jan 26 05:06:47 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : Loading success.
Jan 26 05:06:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:47 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.937 226326 DEBUG nova.compute.manager [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.937 226326 DEBUG nova.compute.manager [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:06:47 np0005595445 nova_compute[226322]: 2026-01-26 10:06:47.938 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:06:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:47 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:48 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100648 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:06:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:06:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:48.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:06:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:48.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:49 np0005595445 nova_compute[226322]: 2026-01-26 10:06:49.119 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:49 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:49 np0005595445 nova_compute[226322]: 2026-01-26 10:06:49.824 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:06:49 np0005595445 nova_compute[226322]: 2026-01-26 10:06:49.825 226326 DEBUG nova.network.neutron [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:06:49 np0005595445 nova_compute[226322]: 2026-01-26 10:06:49.841 226326 DEBUG oslo_concurrency.lockutils [req-d0e10174-ddc4-455a-94fb-0cfe24442373 req-43421b70-be97-4c90-acab-8088287865e6 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:06:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:49 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:50 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:06:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:50.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:06:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:06:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:50.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:06:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:51 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:51 np0005595445 nova_compute[226322]: 2026-01-26 10:06:51.958 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:51 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:52 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:52 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:52.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:52.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:53 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.928 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:06:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.929 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:06:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:06:53.930 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:06:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:53 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:54 np0005595445 nova_compute[226322]: 2026-01-26 10:06:54.122 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:54 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:54.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:54.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:55 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb30c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:55 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:55 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:56 np0005595445 podman[230044]: 2026-01-26 10:06:56.306834439 +0000 UTC m=+0.089578523 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 05:06:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:56 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2f0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:06:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:56.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:56.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:56 np0005595445 nova_compute[226322]: 2026-01-26 10:06:56.960 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:57 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:57Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 05:06:57 np0005595445 ovn_controller[133670]: 2026-01-26T10:06:57Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:cd:f5 10.100.0.8
Jan 26 05:06:57 np0005595445 kernel: ganesha.nfsd[229507]: segfault at 50 ip 00007fb39e93f32e sp 00007fb349ffa210 error 4 in libntirpc.so.5.8[7fb39e924000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 26 05:06:57 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 05:06:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[229176]: 26/01/2026 10:06:57 : epoch 69773c9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb3180034e0 fd 38 proxy ignored for local
Jan 26 05:06:57 np0005595445 systemd[1]: Started Process Core Dump (PID 230072/UID 0).
Jan 26 05:06:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:06:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:06:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:06:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156652848' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:06:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:06:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:06:58.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:06:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:06:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:06:58.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:06:59 np0005595445 systemd-coredump[230073]: Process 229180 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007fb39e93f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 05:06:59 np0005595445 nova_compute[226322]: 2026-01-26 10:06:59.124 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:06:59 np0005595445 systemd[1]: systemd-coredump@10-230072-0.service: Deactivated successfully.
Jan 26 05:06:59 np0005595445 systemd[1]: systemd-coredump@10-230072-0.service: Consumed 1.149s CPU time.
Jan 26 05:06:59 np0005595445 podman[230078]: 2026-01-26 10:06:59.281069933 +0000 UTC m=+0.031726659 container died 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 05:06:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay-6018c0b2554aae38b16b4fd14d15b48c1e664305ff0d647a5d04a7a440271808-merged.mount: Deactivated successfully.
Jan 26 05:06:59 np0005595445 podman[230078]: 2026-01-26 10:06:59.322004703 +0000 UTC m=+0.072661449 container remove 926ab88f4fc917023edfabab998a063a7c947db60c118100bca1f0cd441e3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 26 05:06:59 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 05:06:59 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 05:06:59 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.268s CPU time.
Jan 26 05:07:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:00.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:01 np0005595445 nova_compute[226322]: 2026-01-26 10:07:01.962 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:07:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:02.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:07:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:03 np0005595445 nova_compute[226322]: 2026-01-26 10:07:03.503 226326 INFO nova.compute.manager [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Get console output#033[00m
Jan 26 05:07:03 np0005595445 nova_compute[226322]: 2026-01-26 10:07:03.508 226326 INFO oslo.privsep.daemon [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp2dcbnn_x/privsep.sock']#033[00m
Jan 26 05:07:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100703 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.177 226326 INFO oslo.privsep.daemon [None req-c917550b-5647-41c7-968f-6a085006bfdc c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.045 230154 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.051 230154 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.055 230154 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.055 230154 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230154#033[00m
Jan 26 05:07:04 np0005595445 nova_compute[226322]: 2026-01-26 10:07:04.268 230154 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 05:07:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:04.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:04.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:06.629 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:07:06 np0005595445 nova_compute[226322]: 2026-01-26 10:07:06.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:06.630 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:07:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999998s ======
Jan 26 05:07:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:06.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999998s
Jan 26 05:07:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:06.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:06 np0005595445 nova_compute[226322]: 2026-01-26 10:07:06.965 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:07:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:08.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:07:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:09 np0005595445 nova_compute[226322]: 2026-01-26 10:07:09.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:09 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 11.
Jan 26 05:07:09 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:07:09 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.268s CPU time.
Jan 26 05:07:09 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 05:07:09 np0005595445 podman[230212]: 2026-01-26 10:07:09.980772449 +0000 UTC m=+0.060574542 container create 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True)
Jan 26 05:07:10 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 05:07:10 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:07:10 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:07:10 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:07:10 np0005595445 podman[230212]: 2026-01-26 10:07:10.037876652 +0000 UTC m=+0.117678795 container init 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 26 05:07:10 np0005595445 podman[230212]: 2026-01-26 10:07:10.042408038 +0000 UTC m=+0.122210141 container start 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 05:07:10 np0005595445 bash[230212]: 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669
Jan 26 05:07:10 np0005595445 podman[230212]: 2026-01-26 10:07:09.956394232 +0000 UTC m=+0.036196355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:07:10 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 05:07:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:10 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:07:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:10.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:11 np0005595445 nova_compute[226322]: 2026-01-26 10:07:11.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:12 np0005595445 podman[230270]: 2026-01-26 10:07:12.281454231 +0000 UTC m=+0.051263580 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:07:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:07:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:12.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:07:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:07:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:07:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:13.632 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:07:14 np0005595445 nova_compute[226322]: 2026-01-26 10:07:14.129 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001999997s ======
Jan 26 05:07:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:14.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001999997s
Jan 26 05:07:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000999999s ======
Jan 26 05:07:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:14.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000999999s
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.080793) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036080844, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 936, "num_deletes": 251, "total_data_size": 2039893, "memory_usage": 2058904, "flush_reason": "Manual Compaction"}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036147279, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1342026, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25071, "largest_seqno": 26002, "table_properties": {"data_size": 1337658, "index_size": 2020, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9870, "raw_average_key_size": 19, "raw_value_size": 1328751, "raw_average_value_size": 2678, "num_data_blocks": 89, "num_entries": 496, "num_filter_entries": 496, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769421968, "oldest_key_time": 1769421968, "file_creation_time": 1769422036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 66523 microseconds, and 3981 cpu microseconds.
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.147321) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1342026 bytes OK
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.147339) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166150) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166204) EVENT_LOG_v1 {"time_micros": 1769422036166196, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166227) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2035166, prev total WAL file size 2035447, number of live WAL files 2.
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.167132) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1310KB)], [48(12MB)]
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036167169, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14796416, "oldest_snapshot_seqno": -1}
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5370 keys, 12632842 bytes, temperature: kUnknown
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036359222, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12632842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12597349, "index_size": 20983, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137884, "raw_average_key_size": 25, "raw_value_size": 12500421, "raw_average_value_size": 2327, "num_data_blocks": 852, "num_entries": 5370, "num_filter_entries": 5370, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422036, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.359426) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12632842 bytes
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.361085) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.0 rd, 65.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.8 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(20.4) write-amplify(9.4) OK, records in: 5890, records dropped: 520 output_compression: NoCompression
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.361100) EVENT_LOG_v1 {"time_micros": 1769422036361093, "job": 28, "event": "compaction_finished", "compaction_time_micros": 192112, "compaction_time_cpu_micros": 23774, "output_level": 6, "num_output_files": 1, "total_output_size": 12632842, "num_input_records": 5890, "num_output_records": 5370, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036361785, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422036363816, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.166996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363916) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:07:16.363921) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:07:16 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:16 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:07:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:16.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:16 np0005595445 nova_compute[226322]: 2026-01-26 10:07:16.971 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100718 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:07:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:19 np0005595445 nova_compute[226322]: 2026-01-26 10:07:19.131 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:20.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:21 np0005595445 nova_compute[226322]: 2026-01-26 10:07:21.973 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000026:nfs.cephfs.0: -2
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 26 05:07:22 np0005595445 nova_compute[226322]: 2026-01-26 10:07:22.683 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:22 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:22 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe38c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:22.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:22.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:23 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:23 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:24 np0005595445 nova_compute[226322]: 2026-01-26 10:07:24.177 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:24 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:07:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:24.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:07:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:24.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100725 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 26 05:07:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:25 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:25 np0005595445 nova_compute[226322]: 2026-01-26 10:07:25.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:25 np0005595445 nova_compute[226322]: 2026-01-26 10:07:25.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:07:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:25 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:26 np0005595445 nova_compute[226322]: 2026-01-26 10:07:26.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:26 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:26 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:26.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:26.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:26 np0005595445 nova_compute[226322]: 2026-01-26 10:07:26.975 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:27 np0005595445 podman[230338]: 2026-01-26 10:07:27.291254261 +0000 UTC m=+0.076701648 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:07:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:27 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.900 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 05:07:27 np0005595445 nova_compute[226322]: 2026-01-26 10:07:27.901 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:07:27 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:27 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:28 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:28.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:28.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:29 np0005595445 nova_compute[226322]: 2026-01-26 10:07:29.179 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:29 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:29 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:30 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:30 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:30.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.109 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.128 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.128 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.129 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.130 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.130 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.131 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.132 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.132 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.150 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.151 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:07:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:07:31 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3033423951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.569 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:07:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:31 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.768 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.768 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000001 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.934 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.935 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4716MB free_disk=59.92194747924805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:07:31 np0005595445 nova_compute[226322]: 2026-01-26 10:07:31.979 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:31 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:31 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance e89171d5-00d1-406a-bcc8-340bfdacbcbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.016 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.056 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/521770293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.477 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.482 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.733 226326 ERROR nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [req-0d619a14-1168-4598-a18e-0c9a74f80743] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID d06842a0-5d13-4573-bb78-d433bbb380e4.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-0d619a14-1168-4598-a18e-0c9a74f80743"}]}#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.750 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:07:32 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:32 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3680016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.779 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.779 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.799 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.820 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:07:32 np0005595445 nova_compute[226322]: 2026-01-26 10:07:32.852 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:07:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:07:33 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049851893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.296 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.301 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.348 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updated inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.349 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.349 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.378 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:07:33 np0005595445 nova_compute[226322]: 2026-01-26 10:07:33.379 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:07:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:33 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3700023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:33 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:34 np0005595445 nova_compute[226322]: 2026-01-26 10:07:34.225 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:34 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100734 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:07:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:34.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:34.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:35 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:35 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:35 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe3700023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:36 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:36 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:36.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:36.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:36 np0005595445 nova_compute[226322]: 2026-01-26 10:07:36.982 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:37 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:37 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:37 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:38 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370002580 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:38 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 05:07:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:38.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:38.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:39 np0005595445 nova_compute[226322]: 2026-01-26 10:07:39.230 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:39 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:40 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:40 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:40 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:07:40 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:40 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:40.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:40.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:41 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:41 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:41 np0005595445 nova_compute[226322]: 2026-01-26 10:07:41.984 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:42 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:42 np0005595445 podman[230574]: 2026-01-26 10:07:42.633553724 +0000 UTC m=+0.055930654 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 05:07:42 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:42 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:07:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:07:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:07:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:42.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:07:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:43 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:07:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:43 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:44 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:44 np0005595445 nova_compute[226322]: 2026-01-26 10:07:44.232 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:44 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:44.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:07:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:44.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:07:45 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:45 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:07:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:07:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:07:46 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:46 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:46.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:46.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:46 np0005595445 nova_compute[226322]: 2026-01-26 10:07:46.986 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:47 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:47 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:48 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:48 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:07:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:48.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:07:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:48.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:49 : epoch 69773cce : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 26 05:07:49 np0005595445 nova_compute[226322]: 2026-01-26 10:07:49.235 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:49 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370003730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:50 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe374003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:50 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 26 05:07:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:50.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:51 np0005595445 kernel: ganesha.nfsd[230323]: segfault at 50 ip 00007fe4163f932e sp 00007fe3c0ff8210 error 4 in libntirpc.so.5.8[7fe4163de000+2c000] likely on CPU 4 (core 0, socket 4)
Jan 26 05:07:51 np0005595445 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 26 05:07:51 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230227]: 26/01/2026 10:07:51 : epoch 69773cce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe384003cc0 fd 48 proxy ignored for local
Jan 26 05:07:51 np0005595445 systemd[1]: Started Process Core Dump (PID 230603/UID 0).
Jan 26 05:07:51 np0005595445 nova_compute[226322]: 2026-01-26 10:07:51.902 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:51 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:51.901 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:07:51 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:51.903 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:07:52 np0005595445 nova_compute[226322]: 2026-01-26 10:07:52.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:52 np0005595445 systemd-coredump[230604]: Process 230231 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007fe4163f932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Jan 26 05:07:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:07:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:07:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:07:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:52.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:07:52 np0005595445 systemd[1]: systemd-coredump@11-230603-0.service: Deactivated successfully.
Jan 26 05:07:52 np0005595445 systemd[1]: systemd-coredump@11-230603-0.service: Consumed 1.175s CPU time.
Jan 26 05:07:53 np0005595445 podman[230610]: 2026-01-26 10:07:53.027847227 +0000 UTC m=+0.020040009 container died 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 05:07:53 np0005595445 systemd[1]: var-lib-containers-storage-overlay-e984c40ba02c1dc8b9e293440818da050517ac9c1b5775a04f860e5f572a1a4e-merged.mount: Deactivated successfully.
Jan 26 05:07:53 np0005595445 podman[230610]: 2026-01-26 10:07:53.060193223 +0000 UTC m=+0.052385975 container remove 9c1ca928cfedd7dde65d34240d8625de067d4d53061cc47a3e852c486a704669 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Jan 26 05:07:53 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Main process exited, code=exited, status=139/n/a
Jan 26 05:07:53 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Failed with result 'exit-code'.
Jan 26 05:07:53 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.551s CPU time.
Jan 26 05:07:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.930 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:07:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:07:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:07:54 np0005595445 nova_compute[226322]: 2026-01-26 10:07:54.237 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:54 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:07:54.905 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:07:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:07:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:54.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:07:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:54.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:56.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:56.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:57 np0005595445 nova_compute[226322]: 2026-01-26 10:07:57.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:57 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/100757 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:07:58 np0005595445 podman[230657]: 2026-01-26 10:07:58.294319228 +0000 UTC m=+0.075047488 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 05:07:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:07:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:07:58.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:07:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:07:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:07:58.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:07:59 np0005595445 nova_compute[226322]: 2026-01-26 10:07:59.278 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:07:59 np0005595445 ovn_controller[133670]: 2026-01-26T10:07:59Z|00032|binding|INFO|Releasing lport 87777a78-2fc8-466a-8f48-d1f2b973e0a9 from this chassis (sb_readonly=0)
Jan 26 05:08:00 np0005595445 nova_compute[226322]: 2026-01-26 10:08:00.031 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:00.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:00.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG nova.compute.manager [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-changed-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG nova.compute.manager [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing instance network info cache due to event network-changed-2a711799-8550-4432-83d1-93c9598eaa25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.347 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.348 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.348 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Refreshing network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.458 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.459 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.460 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.460 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.461 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.463 226326 INFO nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Terminating instance#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.465 226326 DEBUG nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 05:08:01 np0005595445 kernel: tap2a711799-85 (unregistering): left promiscuous mode
Jan 26 05:08:01 np0005595445 NetworkManager[49073]: <info>  [1769422081.5633] device (tap2a711799-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 05:08:01 np0005595445 ovn_controller[133670]: 2026-01-26T10:08:01Z|00033|binding|INFO|Releasing lport 2a711799-8550-4432-83d1-93c9598eaa25 from this chassis (sb_readonly=0)
Jan 26 05:08:01 np0005595445 ovn_controller[133670]: 2026-01-26T10:08:01Z|00034|binding|INFO|Setting lport 2a711799-8550-4432-83d1-93c9598eaa25 down in Southbound
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.574 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 ovn_controller[133670]: 2026-01-26T10:08:01Z|00035|binding|INFO|Removing iface tap2a711799-85 ovn-installed in OVS
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.585 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:cd:f5 10.100.0.8'], port_security=['fa:16:3e:30:cd:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e89171d5-00d1-406a-bcc8-340bfdacbcbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9f16b9a-82ea-4f86-8e79-c292f5efcb52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aeef230-e621-407c-aaae-8f0a628ce92a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=2a711799-8550-4432-83d1-93c9598eaa25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.586 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 2a711799-8550-4432-83d1-93c9598eaa25 in datapath 7939c8f9-060e-41e0-9e41-ebecf5f62dfb unbound from our chassis#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.588 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7939c8f9-060e-41e0-9e41-ebecf5f62dfb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.589 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[afcec71b-a998-4226-bd3c-33d5e83d8758]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.589 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb namespace which is not needed anymore#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.596 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 26 05:08:01 np0005595445 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.458s CPU time.
Jan 26 05:08:01 np0005595445 systemd-machined[194876]: Machine qemu-1-instance-00000001 terminated.
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.700 226326 INFO nova.virt.libvirt.driver [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Instance destroyed successfully.#033[00m
Jan 26 05:08:01 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : haproxy version is 2.8.14-c23fe91
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.701 226326 DEBUG nova.objects.instance [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid e89171d5-00d1-406a-bcc8-340bfdacbcbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:08:01 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [NOTICE]   (230023) : path to executable is /usr/sbin/haproxy
Jan 26 05:08:01 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [WARNING]  (230023) : Exiting Master process...
Jan 26 05:08:01 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [ALERT]    (230023) : Current worker (230026) exited with code 143 (Terminated)
Jan 26 05:08:01 np0005595445 neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb[230019]: [WARNING]  (230023) : All workers exited. Exiting... (0)
Jan 26 05:08:01 np0005595445 systemd[1]: libpod-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope: Deactivated successfully.
Jan 26 05:08:01 np0005595445 podman[230709]: 2026-01-26 10:08:01.71308444 +0000 UTC m=+0.040047167 container died 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.729 226326 DEBUG nova.virt.libvirt.vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1457730655',display_name='tempest-TestNetworkBasicOps-server-1457730655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1457730655',id=1,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzAq4PU9BtYDVE2893QINCbdmYXSOCaAokqiTddkRve8RypnqtELDxIf91MHtzezEtPVmgAZuG9SeAnge4WS1FhmBBt7X/P4oQbB5mob/LapxwgpslluUHdSteOPotImA==',key_name='tempest-TestNetworkBasicOps-1466537094',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:06:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-snk42cm3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:06:43Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=e89171d5-00d1-406a-bcc8-340bfdacbcbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.730 226326 DEBUG nova.network.os_vif_util [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.731 226326 DEBUG nova.network.os_vif_util [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.731 226326 DEBUG os_vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.733 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a711799-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:08:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6-userdata-shm.mount: Deactivated successfully.
Jan 26 05:08:01 np0005595445 systemd[1]: var-lib-containers-storage-overlay-6308204caa6ef9ef44a497d3a0e6f90a3cb0448655da17487cc7918118dc819d-merged.mount: Deactivated successfully.
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.768 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.772 226326 INFO os_vif [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:cd:f5,bridge_name='br-int',has_traffic_filtering=True,id=2a711799-8550-4432-83d1-93c9598eaa25,network=Network(7939c8f9-060e-41e0-9e41-ebecf5f62dfb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a711799-85')#033[00m
Jan 26 05:08:01 np0005595445 podman[230709]: 2026-01-26 10:08:01.773988656 +0000 UTC m=+0.100951383 container cleanup 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:08:01 np0005595445 systemd[1]: libpod-conmon-66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6.scope: Deactivated successfully.
Jan 26 05:08:01 np0005595445 podman[230755]: 2026-01-26 10:08:01.830881352 +0000 UTC m=+0.037955886 container remove 66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.836 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[924f91d8-05a9-4eb6-93bc-4e3242e50629]: (4, ('Mon Jan 26 10:08:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb (66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6)\n66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6\nMon Jan 26 10:08:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb (66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6)\n66c1d839ded6d2f02f516ea9c9da1599d83955c3a4c1a581e719d12d95ba47e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.837 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ef493ef8-457c-4175-a88e-a4142ebb63ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.839 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7939c8f9-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:08:01 np0005595445 kernel: tap7939c8f9-00: left promiscuous mode
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.844 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[815448fc-659f-4cd7-9378-a13e1e1e8dd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 nova_compute[226322]: 2026-01-26 10:08:01.854 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.862 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c043c093-41fb-4aa9-9c14-92500e49ade5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.863 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[179b3520-6e16-4652-8c9d-941170dda6ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.874 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa69c9d-ee54-4e78-997f-4b2d855a2a67]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396734, 'reachable_time': 32982, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230784, 'error': None, 'target': 'ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:01 np0005595445 systemd[1]: run-netns-ovnmeta\x2d7939c8f9\x2d060e\x2d41e0\x2d9e41\x2debecf5f62dfb.mount: Deactivated successfully.
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.883 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7939c8f9-060e-41e0-9e41-ebecf5f62dfb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 05:08:01 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:01.884 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[7d080ee7-3578-4f18-b6f0-7ea0d369fe46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:08:02 np0005595445 nova_compute[226322]: 2026-01-26 10:08:02.780 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updated VIF entry in instance network info cache for port 2a711799-8550-4432-83d1-93c9598eaa25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:08:02 np0005595445 nova_compute[226322]: 2026-01-26 10:08:02.781 226326 DEBUG nova.network.neutron [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [{"id": "2a711799-8550-4432-83d1-93c9598eaa25", "address": "fa:16:3e:30:cd:f5", "network": {"id": "7939c8f9-060e-41e0-9e41-ebecf5f62dfb", "bridge": "br-int", "label": "tempest-network-smoke--1698349963", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a711799-85", "ovs_interfaceid": "2a711799-8550-4432-83d1-93c9598eaa25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:08:02 np0005595445 nova_compute[226322]: 2026-01-26 10:08:02.826 226326 DEBUG oslo_concurrency.lockutils [req-c639b0af-8c3a-471c-800e-2e573c44ac69 req-b14e9264-6dc7-4960-978c-5dff71c0282b b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-e89171d5-00d1-406a-bcc8-340bfdacbcbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:08:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:02.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:02.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.153 226326 INFO nova.virt.libvirt.driver [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deleting instance files /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc_del#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.154 226326 INFO nova.virt.libvirt.driver [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deletion of /var/lib/nova/instances/e89171d5-00d1-406a-bcc8-340bfdacbcbc_del complete#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.219 226326 DEBUG nova.virt.libvirt.host [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.220 226326 INFO nova.virt.libvirt.host [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] UEFI support detected#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 INFO nova.compute.manager [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 1.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG oslo.service.loopingcall [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.222 226326 DEBUG nova.network.neutron [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.428 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.429 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-unplugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.430 226326 DEBUG oslo_concurrency.lockutils [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.431 226326 DEBUG nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] No waiting events found dispatching network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.431 226326 WARNING nova.compute.manager [req-0cec8d4c-0146-4df3-bfd7-1d9b2af7fb20 req-8e3e9bf0-ba6d-4cce-919a-b1b921c69372 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received unexpected event network-vif-plugged-2a711799-8550-4432-83d1-93c9598eaa25 for instance with vm_state active and task_state deleting.#033[00m
Jan 26 05:08:03 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Scheduled restart job, restart counter is at 12.
Jan 26 05:08:03 np0005595445 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:08:03 np0005595445 systemd[1]: ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30@nfs.cephfs.0.0.compute-1.thyhvc.service: Consumed 1.551s CPU time.
Jan 26 05:08:03 np0005595445 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30...
Jan 26 05:08:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:03 np0005595445 podman[230864]: 2026-01-26 10:08:03.767589324 +0000 UTC m=+0.053892521 container create 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:08:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 26 05:08:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:08:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:08:03 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f886142fbe5c7c4ceb50c688ee912036a0bd37edf44bca34f52bdc51ce28cbf9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.thyhvc-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 26 05:08:03 np0005595445 podman[230864]: 2026-01-26 10:08:03.742060029 +0000 UTC m=+0.028363266 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:08:03 np0005595445 podman[230864]: 2026-01-26 10:08:03.834750056 +0000 UTC m=+0.121053243 container init 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:08:03 np0005595445 podman[230864]: 2026-01-26 10:08:03.848666972 +0000 UTC m=+0.134970149 container start 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 26 05:08:03 np0005595445 bash[230864]: 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3
Jan 26 05:08:03 np0005595445 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.thyhvc for 1a70b85d-e3fd-5814-8a6a-37ea00fcae30.
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.970 226326 DEBUG nova.network.neutron [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 26 05:08:03 np0005595445 nova_compute[226322]: 2026-01-26 10:08:03.990 226326 INFO nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Took 0.77 seconds to deallocate network for instance.#033[00m
Jan 26 05:08:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 26 05:08:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.064 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.065 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.121 226326 DEBUG oslo_concurrency.processutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.150 226326 DEBUG nova.compute.manager [req-31d93124-1aea-483d-9968-14190943187e req-5361482f-372d-4dab-8d55-d7a25f53c306 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Received event network-vif-deleted-2a711799-8550-4432-83d1-93c9598eaa25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:08:04 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3400737954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.640 226326 DEBUG oslo_concurrency.processutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.645 226326 DEBUG nova.compute.provider_tree [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.683 226326 DEBUG nova.scheduler.client.report [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.704 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.729 226326 INFO nova.scheduler.client.report [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance e89171d5-00d1-406a-bcc8-340bfdacbcbc#033[00m
Jan 26 05:08:04 np0005595445 nova_compute[226322]: 2026-01-26 10:08:04.916 226326 DEBUG oslo_concurrency.lockutils [None req-512a4514-4bba-4df3-a8c5-5d3094974e50 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "e89171d5-00d1-406a-bcc8-340bfdacbcbc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:04.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:04.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:06 np0005595445 nova_compute[226322]: 2026-01-26 10:08:06.857 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:06.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:06.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:08 np0005595445 nova_compute[226322]: 2026-01-26 10:08:08.289 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:08 np0005595445 nova_compute[226322]: 2026-01-26 10:08:08.367 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:08.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:08.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:09 np0005595445 nova_compute[226322]: 2026-01-26 10:08:09.322 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:10 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:10 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:10.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:10.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:11 np0005595445 nova_compute[226322]: 2026-01-26 10:08:11.859 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:12.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:12.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:13 np0005595445 podman[230950]: 2026-01-26 10:08:13.271870441 +0000 UTC m=+0.051585424 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 05:08:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:14 np0005595445 nova_compute[226322]: 2026-01-26 10:08:14.324 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:14.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:14.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:15 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:15 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:16 np0005595445 nova_compute[226322]: 2026-01-26 10:08:16.700 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422081.6982865, e89171d5-00d1-406a-bcc8-340bfdacbcbc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:08:16 np0005595445 nova_compute[226322]: 2026-01-26 10:08:16.700 226326 INFO nova.compute.manager [-] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] VM Stopped (Lifecycle Event)#033[00m
Jan 26 05:08:16 np0005595445 nova_compute[226322]: 2026-01-26 10:08:16.924 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:16 np0005595445 nova_compute[226322]: 2026-01-26 10:08:16.960 226326 DEBUG nova.compute.manager [None req-afcc0dc7-3569-4c31-8d19-3d67fd7250ff - - - - - -] [instance: e89171d5-00d1-406a-bcc8-340bfdacbcbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:08:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:17.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:18.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:19.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:19 np0005595445 nova_compute[226322]: 2026-01-26 10:08:19.380 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:20 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:20 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:20.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:08:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:21.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:08:21 np0005595445 nova_compute[226322]: 2026-01-26 10:08:21.925 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:22.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:23.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:24 np0005595445 nova_compute[226322]: 2026-01-26 10:08:24.381 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:24.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:25 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:25.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:26 np0005595445 nova_compute[226322]: 2026-01-26 10:08:26.986 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:26.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:27.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:29.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:29 np0005595445 podman[231006]: 2026-01-26 10:08:29.351592762 +0000 UTC m=+0.127440813 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 05:08:29 np0005595445 nova_compute[226322]: 2026-01-26 10:08:29.383 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:29 np0005595445 nova_compute[226322]: 2026-01-26 10:08:29.933 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:29 np0005595445 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:29 np0005595445 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:08:29 np0005595445 nova_compute[226322]: 2026-01-26 10:08:29.934 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.088 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.089 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.090 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.090 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.091 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.112 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.113 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:08:30 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:08:30 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2323226978' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.567 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.717 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4932MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.718 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.797 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.797 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:08:30 np0005595445 nova_compute[226322]: 2026-01-26 10:08:30.813 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:08:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:31.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:31.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:08:31 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2213818827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.288 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.294 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.308 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.327 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.327 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.925 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.925 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:08:31 np0005595445 nova_compute[226322]: 2026-01-26 10:08:31.989 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:08:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:33.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:08:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:33.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:34 np0005595445 nova_compute[226322]: 2026-01-26 10:08:34.386 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:35.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:35.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:36 np0005595445 nova_compute[226322]: 2026-01-26 10:08:36.992 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:37.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:08:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:37.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:08:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:39.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:39 np0005595445 nova_compute[226322]: 2026-01-26 10:08:39.388 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:41.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:41.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:08:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:08:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:08:41 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:08:41 np0005595445 nova_compute[226322]: 2026-01-26 10:08:41.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:42 np0005595445 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 05:08:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:43.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:43.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:44 np0005595445 podman[231192]: 2026-01-26 10:08:44.279471278 +0000 UTC m=+0.055624986 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 05:08:44 np0005595445 ovn_controller[133670]: 2026-01-26T10:08:44Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 05:08:44 np0005595445 nova_compute[226322]: 2026-01-26 10:08:44.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:45.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:45.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:08:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:08:46 np0005595445 nova_compute[226322]: 2026-01-26 10:08:46.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:08:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:08:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:49.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:49 np0005595445 nova_compute[226322]: 2026-01-26 10:08:49.431 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:51.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:51.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:51 np0005595445 nova_compute[226322]: 2026-01-26 10:08:51.998 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:53.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:08:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:53.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:08:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.931 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:08:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:08:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:08:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:08:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:54 np0005595445 nova_compute[226322]: 2026-01-26 10:08:54.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:55.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:55.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:57 np0005595445 nova_compute[226322]: 2026-01-26 10:08:57.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:08:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:08:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:57.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:08:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:57.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:08:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:08:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:08:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:08:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:08:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:08:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:08:59.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:08:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:08:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:08:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:08:59 np0005595445 nova_compute[226322]: 2026-01-26 10:08:59.471 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:00 np0005595445 podman[231249]: 2026-01-26 10:09:00.306931041 +0000 UTC m=+0.089330055 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:09:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:01.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:02 np0005595445 nova_compute[226322]: 2026-01-26 10:09:02.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:03.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:03.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:03 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:03.823 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:09:03 np0005595445 nova_compute[226322]: 2026-01-26 10:09:03.824 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:03 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:03.824 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:09:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:04 np0005595445 nova_compute[226322]: 2026-01-26 10:09:04.474 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:05.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:05.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:07 np0005595445 nova_compute[226322]: 2026-01-26 10:09:07.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:07.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:07.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:09.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:09.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:09 np0005595445 nova_compute[226322]: 2026-01-26 10:09:09.527 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:11.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:11.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:12 np0005595445 nova_compute[226322]: 2026-01-26 10:09:12.055 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:12.827 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:13.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:13.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:14 np0005595445 nova_compute[226322]: 2026-01-26 10:09:14.531 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:15.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:15.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:15 np0005595445 podman[231311]: 2026-01-26 10:09:15.282494746 +0000 UTC m=+0.054357408 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:09:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:17.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:17 np0005595445 nova_compute[226322]: 2026-01-26 10:09:17.093 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:17.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:19.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:19.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:19 np0005595445 nova_compute[226322]: 2026-01-26 10:09:19.600 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:21.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:22 np0005595445 nova_compute[226322]: 2026-01-26 10:09:22.146 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:23.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:23.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:24 np0005595445 nova_compute[226322]: 2026-01-26 10:09:24.601 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:24 np0005595445 nova_compute[226322]: 2026-01-26 10:09:24.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:25.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:27.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:27.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:27 np0005595445 nova_compute[226322]: 2026-01-26 10:09:27.147 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:28 np0005595445 nova_compute[226322]: 2026-01-26 10:09:28.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:29.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:29.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.603 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.680 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.707 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.708 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:29 np0005595445 nova_compute[226322]: 2026-01-26 10:09:29.709 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.707 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.708 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:09:30 np0005595445 nova_compute[226322]: 2026-01-26 10:09:30.709 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:31.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:31.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:09:31 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1839213161' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.152 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:31 np0005595445 podman[231388]: 2026-01-26 10:09:31.303449877 +0000 UTC m=+0.078947526 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.325 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.326 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4945MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.326 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.327 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.483 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.483 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.496 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:09:31 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/267005787' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.954 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.959 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.980 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.981 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 05:09:31 np0005595445 nova_compute[226322]: 2026-01-26 10:09:31.982 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:32 np0005595445 nova_compute[226322]: 2026-01-26 10:09:32.149 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:32 np0005595445 nova_compute[226322]: 2026-01-26 10:09:32.981 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:09:32 np0005595445 nova_compute[226322]: 2026-01-26 10:09:32.982 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:09:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:33.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:33.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:34 np0005595445 nova_compute[226322]: 2026-01-26 10:09:34.604 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000013s ======
Jan 26 05:09:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:35.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000013s
Jan 26 05:09:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:35.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:37.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:37.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:37 np0005595445 nova_compute[226322]: 2026-01-26 10:09:37.151 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:39.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:09:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:39.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:09:39 np0005595445 nova_compute[226322]: 2026-01-26 10:09:39.606 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:41.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:41.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:42 np0005595445 nova_compute[226322]: 2026-01-26 10:09:42.154 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:43.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:43.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:44 np0005595445 nova_compute[226322]: 2026-01-26 10:09:44.651 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:45.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:45.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:45 np0005595445 podman[231496]: 2026-01-26 10:09:45.668606711 +0000 UTC m=+0.049415888 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:09:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:09:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.095 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.095 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:09:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.117 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 26 05:09:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:47.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.180 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.197 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.198 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.202 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.202 226326 INFO nova.compute.claims [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Claim successful on node compute-1.ctlplane.example.com
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.313 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:09:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:09:47 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1238899685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.767 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.772 226326 DEBUG nova.compute.provider_tree [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.795 226326 DEBUG nova.scheduler.client.report [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.826 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.827 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.884 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.885 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.906 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 05:09:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:09:47 np0005595445 nova_compute[226322]: 2026-01-26 10:09:47.929 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.013 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.015 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.016 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating image(s)
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.054 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.081 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.107 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.111 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.186 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.187 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.188 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.188 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.214 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.217 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.477 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.542 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 26 05:09:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.646 226326 DEBUG nova.objects.instance [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.670 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.670 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Ensure instance console log exists: /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.671 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:48 np0005595445 nova_compute[226322]: 2026-01-26 10:09:48.692 226326 DEBUG nova.policy [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 26 05:09:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:49.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:49 np0005595445 nova_compute[226322]: 2026-01-26 10:09:49.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:50 np0005595445 nova_compute[226322]: 2026-01-26 10:09:50.806 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Successfully created port: a3edb0bf-fece-4486-bde7-200b47a3207f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 05:09:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:09:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:09:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:51.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.182 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.372 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Successfully updated port: a3edb0bf-fece-4486-bde7-200b47a3207f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.398 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.399 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.399 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.523 226326 DEBUG nova.compute.manager [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.524 226326 DEBUG nova.compute.manager [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.524 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:09:52 np0005595445 nova_compute[226322]: 2026-01-26 10:09:52.658 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 05:09:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:09:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:09:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000024s ======
Jan 26 05:09:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Jan 26 05:09:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.933 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:09:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.934 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:09:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:53.934 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:09:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.014 226326 DEBUG nova.network.neutron [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.039 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.039 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance network_info: |[{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.040 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.040 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.043 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start _get_guest_xml network_info=[{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.046 226326 WARNING nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.052 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.053 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.055 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.libvirt.host [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.056 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.057 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.058 226326 DEBUG nova.virt.hardware [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.061 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:09:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1908295695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.489 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.510 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.513 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:09:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2437790993' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.958 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.959 226326 DEBUG nova.virt.libvirt.vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:09:47Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.959 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.960 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.961 226326 DEBUG nova.objects.instance [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.982 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] End _get_guest_xml xml=<domain type="kvm">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <uuid>eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</uuid>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <name>instance-00000004</name>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <memory>131072</memory>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <vcpu>1</vcpu>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <metadata>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:name>tempest-TestNetworkBasicOps-server-1372750228</nova:name>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:creationTime>2026-01-26 10:09:54</nova:creationTime>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:flavor name="m1.nano">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:memory>128</nova:memory>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:disk>1</nova:disk>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:swap>0</nova:swap>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:vcpus>1</nova:vcpus>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </nova:flavor>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:owner>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </nova:owner>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <nova:ports>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <nova:port uuid="a3edb0bf-fece-4486-bde7-200b47a3207f">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        </nova:port>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </nova:ports>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </nova:instance>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </metadata>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <sysinfo type="smbios">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <system>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="manufacturer">RDO</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="product">OpenStack Compute</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="serial">eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="uuid">eafb89b3-bd76-4bd7-9ba0-c169e9e02d17</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <entry name="family">Virtual Machine</entry>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </system>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </sysinfo>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <os>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <boot dev="hd"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <smbios mode="sysinfo"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <acpi/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <apic/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <vmcoreinfo/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <clock offset="utc">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <timer name="hpet" present="no"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </clock>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <cpu mode="host-model" match="exact">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <disk type="network" device="disk">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <target dev="vda" bus="virtio"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <disk type="network" device="cdrom">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <target dev="sda" bus="sata"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <interface type="ethernet">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <mac address="fa:16:3e:93:05:f9"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <mtu size="1442"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <target dev="tapa3edb0bf-fe"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <serial type="pty">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <log file="/var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/console.log" append="off"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </serial>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <video>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <input type="tablet" bus="usb"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <rng model="virtio">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <backend model="random">/dev/urandom</backend>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <controller type="usb" index="0"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    <memballoon model="virtio">
Jan 26 05:09:54 np0005595445 nova_compute[226322]:      <stats period="10"/>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:    </memballoon>
Jan 26 05:09:54 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:09:54 np0005595445 nova_compute[226322]: </domain>
Jan 26 05:09:54 np0005595445 nova_compute[226322]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.984 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Preparing to wait for external event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.985 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.985 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.986 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.987 226326 DEBUG nova.virt.libvirt.vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:09:47Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.988 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.989 226326 DEBUG nova.network.os_vif_util [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.989 226326 DEBUG os_vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.991 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.992 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.997 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.997 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3edb0bf-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:54 np0005595445 nova_compute[226322]: 2026-01-26 10:09:54.999 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa3edb0bf-fe, col_values=(('external_ids', {'iface-id': 'a3edb0bf-fece-4486-bde7-200b47a3207f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:05:f9', 'vm-uuid': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:55 np0005595445 NetworkManager[49073]: <info>  [1769422195.0019] manager: (tapa3edb0bf-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.009 226326 INFO os_vif [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe')#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.061 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.061 226326 DEBUG nova.network.neutron [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.064 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:93:05:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.065 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Using config drive#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.087 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.091 226326 DEBUG oslo_concurrency.lockutils [req-78ef31bd-333d-4c21-8bbe-825be9d9f479 req-30aa49d8-3ff1-41ec-9115-5c13badcc7d1 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:09:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:55 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.667 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Creating config drive at /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.673 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre49voty execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.807 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpre49voty" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.832 226326 DEBUG nova.storage.rbd_utils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:09:55 np0005595445 nova_compute[226322]: 2026-01-26 10:09:55.835 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.372 226326 DEBUG oslo_concurrency.processutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.373 226326 INFO nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deleting local config drive /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17/disk.config because it was imported into RBD.#033[00m
Jan 26 05:09:56 np0005595445 systemd[1]: Starting libvirt secret daemon...
Jan 26 05:09:56 np0005595445 systemd[1]: Started libvirt secret daemon.
Jan 26 05:09:56 np0005595445 kernel: tapa3edb0bf-fe: entered promiscuous mode
Jan 26 05:09:56 np0005595445 ovn_controller[133670]: 2026-01-26T10:09:56Z|00037|binding|INFO|Claiming lport a3edb0bf-fece-4486-bde7-200b47a3207f for this chassis.
Jan 26 05:09:56 np0005595445 ovn_controller[133670]: 2026-01-26T10:09:56Z|00038|binding|INFO|a3edb0bf-fece-4486-bde7-200b47a3207f: Claiming fa:16:3e:93:05:f9 10.100.0.6
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.4976] manager: (tapa3edb0bf-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.496 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.501 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.509 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:05:f9 10.100.0.6'], port_security=['fa:16:3e:93:05:f9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fcc6dd6-8be1-499a-bb83-b80d1cff1d47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83c6eda3-fd53-4ca0-bad4-44f3e56aac7f, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=a3edb0bf-fece-4486-bde7-200b47a3207f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.510 143326 INFO neutron.agent.ovn.metadata.agent [-] Port a3edb0bf-fece-4486-bde7-200b47a3207f in datapath 586582bb-e1bc-4aa4-9785-243b8df2dbcb bound to our chassis#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.512 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 586582bb-e1bc-4aa4-9785-243b8df2dbcb#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.522 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8e7be7-fb53-41c9-9d54-9fe5edbc1d06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.523 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap586582bb-e1 in ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.525 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap586582bb-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.525 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f2a8c5-41f9-43c7-9d4a-5b4dbe308321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 systemd-machined[194876]: New machine qemu-2-instance-00000004.
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.526 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4d89784d-fa91-4812-9869-c7a94ccc1de4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.536 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[604ffeee-e557-47a2-87f3-db8c47502f4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.562 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c52612ec-d65f-46ef-a286-528269320c31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 systemd-udevd[231950]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 ovn_controller[133670]: 2026-01-26T10:09:56Z|00039|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f ovn-installed in OVS
Jan 26 05:09:56 np0005595445 ovn_controller[133670]: 2026-01-26T10:09:56Z|00040|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f up in Southbound
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.578 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.5826] device (tapa3edb0bf-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.5839] device (tapa3edb0bf-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.590 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[f1368e9b-7932-4abb-9ffb-df4f88f03788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.5974] manager: (tap586582bb-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.596 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[314f8455-e4e3-46ef-958b-ddc7e0709f5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.621 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aaad4e-cc65-48c3-8789-bc1586aee5c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.625 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd4c300-c473-42cd-8c8f-6fcaab2ebfef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.6458] device (tap586582bb-e0): carrier: link connected
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.650 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d73dcb94-1598-4fca-9ff7-b5003d00f751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.665 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[212a0d26-34c2-4dd6-8760-86882d29ff9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap586582bb-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:be:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415721, 'reachable_time': 21005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231980, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.678 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[456378b0-b768-4023-88e8-0b546758560d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:bea7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415721, 'tstamp': 415721}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231981, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.695 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[38c9f083-197b-4c88-8402-d8cd3f8b3a69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap586582bb-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:be:a7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415721, 'reachable_time': 21005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231982, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.728 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[465182b2-359a-436f-be8b-d408b93a9efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.782 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[6c35b87d-4700-426e-bbbd-7c0bb63b71c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap586582bb-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.784 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap586582bb-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 NetworkManager[49073]: <info>  [1769422196.7868] manager: (tap586582bb-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 05:09:56 np0005595445 kernel: tap586582bb-e0: entered promiscuous mode
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.789 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.789 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap586582bb-e0, col_values=(('external_ids', {'iface-id': 'e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 ovn_controller[133670]: 2026-01-26T10:09:56Z|00041|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.821 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.822 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.823 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6503c1-6a77-4405-a3ce-30aa148249dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.825 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: global
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    log         /dev/log local0 debug
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    log-tag     haproxy-metadata-proxy-586582bb-e1bc-4aa4-9785-243b8df2dbcb
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    user        root
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    group       root
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    maxconn     1024
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    pidfile     /var/lib/neutron/external/pids/586582bb-e1bc-4aa4-9785-243b8df2dbcb.pid.haproxy
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    daemon
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: defaults
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    log global
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    mode http
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    option httplog
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    option dontlognull
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    option http-server-close
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    option forwardfor
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    retries                 3
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    timeout http-request    30s
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    timeout connect         30s
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    timeout client          32s
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    timeout server          32s
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    timeout http-keep-alive 30s
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: listen listener
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    bind 169.254.169.254:80
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]:    http-request add-header X-OVN-Network-ID 586582bb-e1bc-4aa4-9785-243b8df2dbcb
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 05:09:56 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:09:56.829 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'env', 'PROCESS_TAG=haproxy-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/586582bb-e1bc-4aa4-9785-243b8df2dbcb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.918 226326 DEBUG nova.compute.manager [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.918 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG oslo_concurrency.lockutils [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:09:56 np0005595445 nova_compute[226322]: 2026-01-26 10:09:56.919 226326 DEBUG nova.compute.manager [req-30538185-e811-4e35-9b27-8e46b84690c8 req-25d17654-f4df-4980-919c-36800ab7912d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Processing event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 05:09:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:57.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:57 np0005595445 podman[232049]: 2026-01-26 10:09:57.249742871 +0000 UTC m=+0.053873667 container create 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.274 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.2742188, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.275 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Started (Lifecycle Event)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.279 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 26 05:09:57 np0005595445 systemd[1]: Started libpod-conmon-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.283 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.286 226326 INFO nova.virt.libvirt.driver [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance spawned successfully.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.286 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 05:09:57 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:09:57 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3f80d88a6c2d2bff340e38e70bb3d56800b97343639f380b5d8e3058e1a4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 05:09:57 np0005595445 podman[232049]: 2026-01-26 10:09:57.222156006 +0000 UTC m=+0.026286852 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 05:09:57 np0005595445 podman[232049]: 2026-01-26 10:09:57.325909545 +0000 UTC m=+0.130040371 container init 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.328 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:09:57 np0005595445 podman[232049]: 2026-01-26 10:09:57.331470821 +0000 UTC m=+0.135601617 container start 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.334 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.337 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.338 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.338 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.339 226326 DEBUG nova.virt.libvirt.driver [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:09:57 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : New worker (232075) forked
Jan 26 05:09:57 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : Loading success.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.389 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.390 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.2744265, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.390 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Paused (Lifecycle Event)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.425 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.429 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422197.281973, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.429 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Resumed (Lifecycle Event)
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.437 226326 INFO nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 9.42 seconds to spawn the instance on the hypervisor.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.437 226326 DEBUG nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.484 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.488 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.525 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.537 226326 INFO nova.compute.manager [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 10.37 seconds to build instance.
Jan 26 05:09:57 np0005595445 nova_compute[226322]: 2026-01-26 10:09:57.557 226326 DEBUG oslo_concurrency.lockutils [None req-4c379447-2374-45bc-bbc6-0178c3224c6e c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:09:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:09:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:09:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1722280942' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:09:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.998 226326 DEBUG nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.998 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG oslo_concurrency.lockutils [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 DEBUG nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 05:09:58 np0005595445 nova_compute[226322]: 2026-01-26 10:09:58.999 226326 WARNING nova.compute.manager [req-0e11bfb7-d2b0-45d4-be60-bbac55617caf req-f77ed1b6-72ce-4b00-93a5-386c536279d3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received unexpected event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with vm_state active and task_state None.
Jan 26 05:09:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:09:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:09:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:09:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:09:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:09:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:09:59.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:09:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:09:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:09:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:09:59 np0005595445 nova_compute[226322]: 2026-01-26 10:09:59.680 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:10:00 np0005595445 nova_compute[226322]: 2026-01-26 10:10:00.002 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:10:00 np0005595445 ceph-mon[80107]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 05:10:00 np0005595445 ceph-mon[80107]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 05:10:00 np0005595445 ceph-mon[80107]:     osd.2 observed slow operation indications in BlueStore
Jan 26 05:10:00 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:00Z|00042|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 05:10:00 np0005595445 NetworkManager[49073]: <info>  [1769422200.9570] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 05:10:00 np0005595445 NetworkManager[49073]: <info>  [1769422200.9578] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 05:10:00 np0005595445 nova_compute[226322]: 2026-01-26 10:10:00.959 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:10:00 np0005595445 nova_compute[226322]: 2026-01-26 10:10:00.989 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:10:00 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:00Z|00043|binding|INFO|Releasing lport e7d448a7-b7a2-4f6b-a25b-09ac0981ee0c from this chassis (sb_readonly=0)
Jan 26 05:10:00 np0005595445 nova_compute[226322]: 2026-01-26 10:10:00.995 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:10:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:01 np0005595445 nova_compute[226322]: 2026-01-26 10:10:01.394 226326 DEBUG nova.compute.manager [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 05:10:01 np0005595445 nova_compute[226322]: 2026-01-26 10:10:01.394 226326 DEBUG nova.compute.manager [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 26 05:10:01 np0005595445 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 26 05:10:01 np0005595445 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 26 05:10:01 np0005595445 nova_compute[226322]: 2026-01-26 10:10:01.395 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 26 05:10:02 np0005595445 nova_compute[226322]: 2026-01-26 10:10:02.302 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 05:10:02 np0005595445 nova_compute[226322]: 2026-01-26 10:10:02.303 226326 DEBUG nova.network.neutron [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 05:10:02 np0005595445 nova_compute[226322]: 2026-01-26 10:10:02.322 226326 DEBUG oslo_concurrency.lockutils [req-5c5f7af9-fa47-4f8f-8992-89a887ee43d4 req-f03b095c-14b2-401c-9ca6-5785a3e4fdb9 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 05:10:02 np0005595445 podman[232088]: 2026-01-26 10:10:02.336011912 +0000 UTC m=+0.100097673 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 05:10:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:03.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:03.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:04 np0005595445 nova_compute[226322]: 2026-01-26 10:10:04.681 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:05 np0005595445 nova_compute[226322]: 2026-01-26 10:10:05.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:05.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:07.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:07.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:09.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:09.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:09 np0005595445 nova_compute[226322]: 2026-01-26 10:10:09.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:10 np0005595445 nova_compute[226322]: 2026-01-26 10:10:10.006 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:11.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:11.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:11 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:11Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:93:05:f9 10.100.0.6
Jan 26 05:10:11 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:11Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:93:05:f9 10.100.0.6
Jan 26 05:10:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000052s ======
Jan 26 05:10:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:13.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Jan 26 05:10:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:13.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:14 np0005595445 nova_compute[226322]: 2026-01-26 10:10:14.686 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:15 np0005595445 nova_compute[226322]: 2026-01-26 10:10:15.007 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:15.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:15.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:15 np0005595445 nova_compute[226322]: 2026-01-26 10:10:15.869 226326 INFO nova.compute.manager [None req-91a8e286-9707-48a9-bb85-8598d8c787c1 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Get console output#033[00m
Jan 26 05:10:15 np0005595445 nova_compute[226322]: 2026-01-26 10:10:15.877 230154 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 26 05:10:16 np0005595445 podman[232151]: 2026-01-26 10:10:16.312168071 +0000 UTC m=+0.082262374 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 05:10:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:17.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:18 np0005595445 nova_compute[226322]: 2026-01-26 10:10:18.945 226326 DEBUG nova.compute.manager [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:10:18 np0005595445 nova_compute[226322]: 2026-01-26 10:10:18.945 226326 DEBUG nova.compute.manager [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing instance network info cache due to event network-changed-a3edb0bf-fece-4486-bde7-200b47a3207f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:10:18 np0005595445 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:10:18 np0005595445 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:10:18 np0005595445 nova_compute[226322]: 2026-01-26 10:10:18.946 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Refreshing network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:10:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:19 np0005595445 nova_compute[226322]: 2026-01-26 10:10:19.060 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:19 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:19.060 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:10:19 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:19.061 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:10:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:19.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:19.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:19 np0005595445 nova_compute[226322]: 2026-01-26 10:10:19.689 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:20 np0005595445 nova_compute[226322]: 2026-01-26 10:10:20.008 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:20 np0005595445 nova_compute[226322]: 2026-01-26 10:10:20.752 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated VIF entry in instance network info cache for port a3edb0bf-fece-4486-bde7-200b47a3207f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:10:20 np0005595445 nova_compute[226322]: 2026-01-26 10:10:20.752 226326 DEBUG nova.network.neutron [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:10:20 np0005595445 nova_compute[226322]: 2026-01-26 10:10:20.771 226326 DEBUG oslo_concurrency.lockutils [req-356e8179-d484-43ca-8394-75bc85c19554 req-b5c4e284-279c-4714-baaf-46ff65eb7a3d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:10:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:21.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:21.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:10:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s#012Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 05:10:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:23.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:23.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:24 np0005595445 nova_compute[226322]: 2026-01-26 10:10:24.692 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:25 np0005595445 nova_compute[226322]: 2026-01-26 10:10:25.009 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:25.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:25.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:27 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:27.063 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:10:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:27.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:27.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:28 np0005595445 nova_compute[226322]: 2026-01-26 10:10:28.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:29.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:29.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:29 np0005595445 nova_compute[226322]: 2026-01-26 10:10:29.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:30 np0005595445 nova_compute[226322]: 2026-01-26 10:10:30.010 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:30 np0005595445 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:30 np0005595445 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:30 np0005595445 nova_compute[226322]: 2026-01-26 10:10:30.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:10:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:31.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:31.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:31 np0005595445 nova_compute[226322]: 2026-01-26 10:10:31.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:31 np0005595445 nova_compute[226322]: 2026-01-26 10:10:31.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:31 np0005595445 nova_compute[226322]: 2026-01-26 10:10:31.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:10:31 np0005595445 nova_compute[226322]: 2026-01-26 10:10:31.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:10:32 np0005595445 nova_compute[226322]: 2026-01-26 10:10:32.135 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:10:32 np0005595445 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:10:32 np0005595445 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 05:10:32 np0005595445 nova_compute[226322]: 2026-01-26 10:10:32.136 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:10:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:33.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:33.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:33 np0005595445 podman[232209]: 2026-01-26 10:10:33.362465196 +0000 UTC m=+0.139902691 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 26 05:10:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.625 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [{"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.640 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.640 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.641 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.660 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.661 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.661 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.662 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:10:33 np0005595445 nova_compute[226322]: 2026-01-26 10:10:33.662 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:10:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:10:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2211083446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.174 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.256 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.257 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.384 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4711MB free_disk=59.92194747924805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.385 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.471 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.522 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.696 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:10:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/577859797' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.972 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:10:34 np0005595445 nova_compute[226322]: 2026-01-26 10:10:34.979 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:10:35 np0005595445 nova_compute[226322]: 2026-01-26 10:10:35.000 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:10:35 np0005595445 nova_compute[226322]: 2026-01-26 10:10:35.011 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:35 np0005595445 nova_compute[226322]: 2026-01-26 10:10:35.025 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:10:35 np0005595445 nova_compute[226322]: 2026-01-26 10:10:35.025 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:10:35 np0005595445 nova_compute[226322]: 2026-01-26 10:10:35.071 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:10:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:35.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000052s ======
Jan 26 05:10:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:35.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Jan 26 05:10:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:37.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:37.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:39.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:39.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:39 np0005595445 nova_compute[226322]: 2026-01-26 10:10:39.698 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:40 np0005595445 nova_compute[226322]: 2026-01-26 10:10:40.013 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:41.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:43.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:44 np0005595445 nova_compute[226322]: 2026-01-26 10:10:44.700 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:45 np0005595445 nova_compute[226322]: 2026-01-26 10:10:45.015 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:45.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:45.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.000800) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247000823, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2365, "num_deletes": 251, "total_data_size": 6250910, "memory_usage": 6354824, "flush_reason": "Manual Compaction"}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247022013, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4034793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26008, "largest_seqno": 28367, "table_properties": {"data_size": 4025507, "index_size": 5780, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19611, "raw_average_key_size": 20, "raw_value_size": 4006696, "raw_average_value_size": 4134, "num_data_blocks": 254, "num_entries": 969, "num_filter_entries": 969, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422036, "oldest_key_time": 1769422036, "file_creation_time": 1769422246, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21326 microseconds, and 7878 cpu microseconds.
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.022122) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4034793 bytes OK
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.022165) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030049) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030067) EVENT_LOG_v1 {"time_micros": 1769422247030063, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.030084) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6240512, prev total WAL file size 6240512, number of live WAL files 2.
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.031674) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3940KB)], [51(12MB)]
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247031827, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16667635, "oldest_snapshot_seqno": -1}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5823 keys, 14554152 bytes, temperature: kUnknown
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247148627, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14554152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14514377, "index_size": 24113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 147983, "raw_average_key_size": 25, "raw_value_size": 14408282, "raw_average_value_size": 2474, "num_data_blocks": 984, "num_entries": 5823, "num_filter_entries": 5823, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422247, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.149008) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14554152 bytes
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.150383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.5 rd, 124.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6339, records dropped: 516 output_compression: NoCompression
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.150402) EVENT_LOG_v1 {"time_micros": 1769422247150393, "job": 30, "event": "compaction_finished", "compaction_time_micros": 116951, "compaction_time_cpu_micros": 53158, "output_level": 6, "num_output_files": 1, "total_output_size": 14554152, "num_input_records": 6339, "num_output_records": 5823, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247151592, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422247154415, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.031526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:10:47.154541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:10:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:47.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:47.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:47 np0005595445 podman[232312]: 2026-01-26 10:10:47.291745691 +0000 UTC m=+0.066122020 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:10:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:49.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:49.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:49 np0005595445 nova_compute[226322]: 2026-01-26 10:10:49.703 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:50 np0005595445 nova_compute[226322]: 2026-01-26 10:10:50.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:51.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:51.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:53.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.935 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:10:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:54 np0005595445 nova_compute[226322]: 2026-01-26 10:10:54.734 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:55 np0005595445 nova_compute[226322]: 2026-01-26 10:10:55.019 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:10:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:55.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:10:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:55.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:56 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.465827683 +0000 UTC m=+0.031420687 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.56684155 +0000 UTC m=+0.132434534 container create 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 05:10:56 np0005595445 systemd[1]: Started libpod-conmon-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope.
Jan 26 05:10:56 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.669294405 +0000 UTC m=+0.234887409 container init 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.681172487 +0000 UTC m=+0.246765471 container start 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.686174669 +0000 UTC m=+0.251767653 container attach 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 26 05:10:56 np0005595445 systemd[1]: libpod-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope: Deactivated successfully.
Jan 26 05:10:56 np0005595445 mystifying_chaum[232598]: 167 167
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.690345328 +0000 UTC m=+0.255938322 container died 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 05:10:56 np0005595445 conmon[232598]: conmon 340ee6cd798e2b37f658 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope/container/memory.events
Jan 26 05:10:56 np0005595445 systemd[1]: var-lib-containers-storage-overlay-a496413aa491a69d954bb122faa92dee8a50cf1ec283f3a879edccad5e1eb4bd-merged.mount: Deactivated successfully.
Jan 26 05:10:56 np0005595445 podman[232582]: 2026-01-26 10:10:56.733070782 +0000 UTC m=+0.298663766 container remove 340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=mystifying_chaum, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 05:10:56 np0005595445 systemd[1]: libpod-conmon-340ee6cd798e2b37f6584634f872fdceb3b8e8b15ef1001bd559aa2f92cdfc7c.scope: Deactivated successfully.
Jan 26 05:10:56 np0005595445 podman[232622]: 2026-01-26 10:10:56.93717642 +0000 UTC m=+0.063103480 container create c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 05:10:56 np0005595445 systemd[1]: Started libpod-conmon-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope.
Jan 26 05:10:57 np0005595445 podman[232622]: 2026-01-26 10:10:56.909444691 +0000 UTC m=+0.035371851 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 26 05:10:57 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:10:57 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 26 05:10:57 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 26 05:10:57 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 26 05:10:57 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 26 05:10:57 np0005595445 podman[232622]: 2026-01-26 10:10:57.033195386 +0000 UTC m=+0.159122486 container init c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 26 05:10:57 np0005595445 podman[232622]: 2026-01-26 10:10:57.039970624 +0000 UTC m=+0.165897694 container start c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 26 05:10:57 np0005595445 podman[232622]: 2026-01-26 10:10:57.043340993 +0000 UTC m=+0.169268083 container attach c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 26 05:10:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:57.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]: [
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:    {
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "available": false,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "being_replaced": false,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "ceph_device_lvm": false,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "lsm_data": {},
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "lvs": [],
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "path": "/dev/sr0",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "rejected_reasons": [
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "Has a FileSystem",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "Insufficient space (<5GB)"
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        ],
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        "sys_api": {
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "actuators": null,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "device_nodes": [
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:                "sr0"
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            ],
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "devname": "sr0",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "human_readable_size": "482.00 KB",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "id_bus": "ata",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "model": "QEMU DVD-ROM",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "nr_requests": "2",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "parent": "/dev/sr0",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "partitions": {},
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "path": "/dev/sr0",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "removable": "1",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "rev": "2.5+",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "ro": "0",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "rotational": "1",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "sas_address": "",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "sas_device_handle": "",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "scheduler_mode": "mq-deadline",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "sectors": 0,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "sectorsize": "2048",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "size": 493568.0,
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "support_discard": "2048",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "type": "disk",
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:            "vendor": "QEMU"
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:        }
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]:    }
Jan 26 05:10:57 np0005595445 goofy_lederberg[232638]: ]
Jan 26 05:10:57 np0005595445 systemd[1]: libpod-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope: Deactivated successfully.
Jan 26 05:10:57 np0005595445 podman[232622]: 2026-01-26 10:10:57.934841921 +0000 UTC m=+1.060768981 container died c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 26 05:10:58 np0005595445 systemd[1]: var-lib-containers-storage-overlay-a50efcbbac7411183286ce8649f8495c8917c72edcff13946694a5932b1ab55e-merged.mount: Deactivated successfully.
Jan 26 05:10:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:10:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:58 np0005595445 podman[232622]: 2026-01-26 10:10:58.641963001 +0000 UTC m=+1.767890061 container remove c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_lederberg, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 26 05:10:58 np0005595445 systemd[1]: libpod-conmon-c332c6308b59c020b68c7df7148137ea27f67c5d6be2aa9e4f16155cefe52c66.scope: Deactivated successfully.
Jan 26 05:10:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:10:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:10:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:10:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:10:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:10:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:10:59.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:10:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:10:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:10:59.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.397 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.398 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.398 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.399 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.399 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.400 226326 INFO nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Terminating instance#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.402 226326 DEBUG nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 05:10:59 np0005595445 kernel: tapa3edb0bf-fe (unregistering): left promiscuous mode
Jan 26 05:10:59 np0005595445 NetworkManager[49073]: <info>  [1769422259.4645] device (tapa3edb0bf-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 05:10:59 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:59Z|00044|binding|INFO|Releasing lport a3edb0bf-fece-4486-bde7-200b47a3207f from this chassis (sb_readonly=0)
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.475 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:59Z|00045|binding|INFO|Setting lport a3edb0bf-fece-4486-bde7-200b47a3207f down in Southbound
Jan 26 05:10:59 np0005595445 ovn_controller[133670]: 2026-01-26T10:10:59Z|00046|binding|INFO|Removing iface tapa3edb0bf-fe ovn-installed in OVS
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.478 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.485 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:05:f9 10.100.0.6'], port_security=['fa:16:3e:93:05:f9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'eafb89b3-bd76-4bd7-9ba0-c169e9e02d17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fcc6dd6-8be1-499a-bb83-b80d1cff1d47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83c6eda3-fd53-4ca0-bad4-44f3e56aac7f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=a3edb0bf-fece-4486-bde7-200b47a3207f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.487 143326 INFO neutron.agent.ovn.metadata.agent [-] Port a3edb0bf-fece-4486-bde7-200b47a3207f in datapath 586582bb-e1bc-4aa4-9785-243b8df2dbcb unbound from our chassis#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.489 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 586582bb-e1bc-4aa4-9785-243b8df2dbcb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.490 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe0008a-4af2-4ad5-8e15-1ad746308f21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.491 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb namespace which is not needed anymore#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.506 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 05:10:59 np0005595445 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.356s CPU time.
Jan 26 05:10:59 np0005595445 systemd-machined[194876]: Machine qemu-2-instance-00000004 terminated.
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : haproxy version is 2.8.14-c23fe91
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [NOTICE]   (232073) : path to executable is /usr/sbin/haproxy
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : Exiting Master process...
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : Exiting Master process...
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [ALERT]    (232073) : Current worker (232075) exited with code 143 (Terminated)
Jan 26 05:10:59 np0005595445 neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb[232069]: [WARNING]  (232073) : All workers exited. Exiting... (0)
Jan 26 05:10:59 np0005595445 systemd[1]: libpod-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope: Deactivated successfully.
Jan 26 05:10:59 np0005595445 podman[234033]: 2026-01-26 10:10:59.646961116 +0000 UTC m=+0.049708079 container died 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.646 226326 INFO nova.virt.libvirt.driver [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Instance destroyed successfully.#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.648 226326 DEBUG nova.objects.instance [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.668 226326 DEBUG nova.virt.libvirt.vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1372750228',display_name='tempest-TestNetworkBasicOps-server-1372750228',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1372750228',id=4,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMsvtjjj+6XI+gg1E4Xzo+HtbXxyF3peabH1GoeKNDwXTbi9v3JnK6KlG1vj7QQfwaNJGTtKqqAkDphs9wj+ke0O+NbixfA149oWC+Na1rLDGTUpAAOvTtxkuWhA6lVC4w==',key_name='tempest-TestNetworkBasicOps-796997991',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:09:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-318o5yar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:09:57Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=eafb89b3-bd76-4bd7-9ba0-c169e9e02d17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.669 226326 DEBUG nova.network.os_vif_util [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "a3edb0bf-fece-4486-bde7-200b47a3207f", "address": "fa:16:3e:93:05:f9", "network": {"id": "586582bb-e1bc-4aa4-9785-243b8df2dbcb", "bridge": "br-int", "label": "tempest-network-smoke--905467262", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3edb0bf-fe", "ovs_interfaceid": "a3edb0bf-fece-4486-bde7-200b47a3207f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.670 226326 DEBUG nova.network.os_vif_util [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.670 226326 DEBUG os_vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.673 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.673 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3edb0bf-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.676 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705-userdata-shm.mount: Deactivated successfully.
Jan 26 05:10:59 np0005595445 systemd[1]: var-lib-containers-storage-overlay-a3d3f80d88a6c2d2bff340e38e70bb3d56800b97343639f380b5d8e3058e1a4a-merged.mount: Deactivated successfully.
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.682 226326 INFO os_vif [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:05:f9,bridge_name='br-int',has_traffic_filtering=True,id=a3edb0bf-fece-4486-bde7-200b47a3207f,network=Network(586582bb-e1bc-4aa4-9785-243b8df2dbcb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3edb0bf-fe')#033[00m
Jan 26 05:10:59 np0005595445 podman[234033]: 2026-01-26 10:10:59.693939681 +0000 UTC m=+0.096686644 container cleanup 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:10:59 np0005595445 systemd[1]: libpod-conmon-6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705.scope: Deactivated successfully.
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.736 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 podman[234089]: 2026-01-26 10:10:59.754810212 +0000 UTC m=+0.041700298 container remove 6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.760 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[043cc56d-10f0-40bd-b4f0-33709d550f90]: (4, ('Mon Jan 26 10:10:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb (6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705)\n6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705\nMon Jan 26 10:10:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb (6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705)\n6ac07cbfd1aa0cb91de505161fd280fcad13a630db2783f5c11f4cff321ee705\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.762 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f60603-906f-4fd0-ad78-c49e3e03e73e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.763 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap586582bb-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.764 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 kernel: tap586582bb-e0: left promiscuous mode
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:10:59 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.782 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a7bfdc-8da5-4f03-9396-891b3119744a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.797 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[f8653c71-2074-4386-a1f4-4ef029cb1c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.798 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[2522c824-0720-4e69-a078-49f421f6b4e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.812 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[629acbac-baec-4532-a555-4dfdc983b124]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415715, 'reachable_time': 37047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234108, 'error': None, 'target': 'ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.815 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-586582bb-e1bc-4aa4-9785-243b8df2dbcb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 05:10:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:10:59.816 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[244f096f-8ad6-44aa-9784-4adb4fc9aa54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:10:59 np0005595445 systemd[1]: run-netns-ovnmeta\x2d586582bb\x2de1bc\x2d4aa4\x2d9785\x2d243b8df2dbcb.mount: Deactivated successfully.
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.922 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.922 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG oslo_concurrency.lockutils [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.923 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:10:59 np0005595445 nova_compute[226322]: 2026-01-26 10:10:59.924 226326 DEBUG nova.compute.manager [req-f7b97531-55f7-4884-9752-aba67056ded4 req-6f1a8d97-d97d-4c64-96b7-8cc98b2ee7ef b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-unplugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.130 226326 INFO nova.virt.libvirt.driver [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deleting instance files /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_del#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.130 226326 INFO nova.virt.libvirt.driver [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deletion of /var/lib/nova/instances/eafb89b3-bd76-4bd7-9ba0-c169e9e02d17_del complete#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.181 226326 INFO nova.compute.manager [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.181 226326 DEBUG oslo.service.loopingcall [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.182 226326 DEBUG nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 05:11:00 np0005595445 nova_compute[226322]: 2026-01-26 10:11:00.182 226326 DEBUG nova.network.neutron [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 05:11:01 np0005595445 nova_compute[226322]: 2026-01-26 10:11:01.111 226326 DEBUG nova.network.neutron [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:11:01 np0005595445 nova_compute[226322]: 2026-01-26 10:11:01.129 226326 INFO nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Took 0.95 seconds to deallocate network for instance.#033[00m
Jan 26 05:11:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:01.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:01.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:01 np0005595445 nova_compute[226322]: 2026-01-26 10:11:01.332 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:11:01 np0005595445 nova_compute[226322]: 2026-01-26 10:11:01.332 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:11:01 np0005595445 nova_compute[226322]: 2026-01-26 10:11:01.607 226326 DEBUG oslo_concurrency.processutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.067 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.068 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.068 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.069 226326 DEBUG oslo_concurrency.lockutils [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.069 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] No waiting events found dispatching network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.070 226326 WARNING nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received unexpected event network-vif-plugged-a3edb0bf-fece-4486-bde7-200b47a3207f for instance with vm_state deleted and task_state None.#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.070 226326 DEBUG nova.compute.manager [req-31c6a930-95af-4209-862d-8a5fc4c9c5f6 req-12ef8123-4b45-4d45-b567-2286c64009cb b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Received event network-vif-deleted-a3edb0bf-fece-4486-bde7-200b47a3207f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:11:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:11:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3970104987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.120 226326 DEBUG oslo_concurrency.processutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.128 226326 DEBUG nova.compute.provider_tree [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.143 226326 DEBUG nova.scheduler.client.report [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.174 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.219 226326 INFO nova.scheduler.client.report [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance eafb89b3-bd76-4bd7-9ba0-c169e9e02d17#033[00m
Jan 26 05:11:02 np0005595445 nova_compute[226322]: 2026-01-26 10:11:02.286 226326 DEBUG oslo_concurrency.lockutils [None req-2825401a-f6fa-4331-a8bc-40c8f9d2fa70 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "eafb89b3-bd76-4bd7-9ba0-c169e9e02d17" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:03.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:03.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:03 np0005595445 podman[234161]: 2026-01-26 10:11:03.754067325 +0000 UTC m=+0.101636699 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:11:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:04 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:11:04 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:11:04 np0005595445 nova_compute[226322]: 2026-01-26 10:11:04.676 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:04 np0005595445 nova_compute[226322]: 2026-01-26 10:11:04.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:05.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:05.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:11:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5443 writes, 28K keys, 5443 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5443 writes, 5443 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1537 writes, 7358 keys, 1537 commit groups, 1.0 writes per commit group, ingest: 17.15 MB, 0.03 MB/s#012Interval WAL: 1537 writes, 1537 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     83.4      0.53              0.14        15    0.035       0      0       0.0       0.0#012  L6      1/0   13.88 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.1    150.2    128.7      1.39              0.42        14    0.099     73K   7423       0.0       0.0#012 Sum      1/0   13.88 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.1    109.0    116.3      1.92              0.55        29    0.066     73K   7423       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7     94.2     96.0      0.79              0.20        10    0.079     29K   2577       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    150.2    128.7      1.39              0.42        14    0.099     73K   7423       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     83.8      0.52              0.14        14    0.037       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.043, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.22 GB write, 0.12 MB/s write, 0.20 GB read, 0.12 MB/s read, 1.9 seconds#012Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 17.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000131 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(916,16.53 MB,5.43732%) FilterBlock(29,217.11 KB,0.0697437%) IndexBlock(29,387.08 KB,0.124344%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 05:11:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:07.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:07.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:08 np0005595445 nova_compute[226322]: 2026-01-26 10:11:08.895 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:08 np0005595445 nova_compute[226322]: 2026-01-26 10:11:08.981 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:09.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:09 np0005595445 nova_compute[226322]: 2026-01-26 10:11:09.679 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:09 np0005595445 nova_compute[226322]: 2026-01-26 10:11:09.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:11.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:13.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:13.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:14 np0005595445 nova_compute[226322]: 2026-01-26 10:11:14.644 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422259.6430793, eafb89b3-bd76-4bd7-9ba0-c169e9e02d17 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:11:14 np0005595445 nova_compute[226322]: 2026-01-26 10:11:14.645 226326 INFO nova.compute.manager [-] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] VM Stopped (Lifecycle Event)#033[00m
Jan 26 05:11:14 np0005595445 nova_compute[226322]: 2026-01-26 10:11:14.661 226326 DEBUG nova.compute.manager [None req-60fa603f-98a4-42e2-9ad8-ee1dff2adb4c - - - - - -] [instance: eafb89b3-bd76-4bd7-9ba0-c169e9e02d17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:11:14 np0005595445 nova_compute[226322]: 2026-01-26 10:11:14.682 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:14 np0005595445 nova_compute[226322]: 2026-01-26 10:11:14.741 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:15.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:15.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:17.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:17.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:18 np0005595445 podman[234229]: 2026-01-26 10:11:18.297004382 +0000 UTC m=+0.062639612 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:11:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:19.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:19 np0005595445 nova_compute[226322]: 2026-01-26 10:11:19.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:19 np0005595445 nova_compute[226322]: 2026-01-26 10:11:19.744 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:20 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:20.055 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:11:20 np0005595445 nova_compute[226322]: 2026-01-26 10:11:20.056 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:20 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:20.056 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:11:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:21.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:23.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:24 np0005595445 nova_compute[226322]: 2026-01-26 10:11:24.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:24 np0005595445 nova_compute[226322]: 2026-01-26 10:11:24.748 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:25.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:25 np0005595445 nova_compute[226322]: 2026-01-26 10:11:25.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:27.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:27.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:28 np0005595445 nova_compute[226322]: 2026-01-26 10:11:28.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:29 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:29.058 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:11:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:29.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.703 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.703 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.704 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 05:11:29 np0005595445 nova_compute[226322]: 2026-01-26 10:11:29.749 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:30 np0005595445 nova_compute[226322]: 2026-01-26 10:11:30.726 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:11:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:31.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:11:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:31.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:32 np0005595445 nova_compute[226322]: 2026-01-26 10:11:32.702 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:11:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:33.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:33.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:33 np0005595445 nova_compute[226322]: 2026-01-26 10:11:33.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:33 np0005595445 nova_compute[226322]: 2026-01-26 10:11:33.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:34 np0005595445 podman[234281]: 2026-01-26 10:11:34.296170875 +0000 UTC m=+0.075617098 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 05:11:34 np0005595445 nova_compute[226322]: 2026-01-26 10:11:34.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:34 np0005595445 nova_compute[226322]: 2026-01-26 10:11:34.751 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.154 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.190 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.191 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.191 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:11:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:35.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:35.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.663 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.816 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.817 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4913MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.818 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.818 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.944 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:11:35 np0005595445 nova_compute[226322]: 2026-01-26 10:11:35.944 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.004 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:11:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:11:36 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3663071606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.657 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.653s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.662 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.686 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.707 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:11:36 np0005595445 nova_compute[226322]: 2026-01-26 10:11:36.707 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:37.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:39.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:39 np0005595445 nova_compute[226322]: 2026-01-26 10:11:39.693 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:39 np0005595445 nova_compute[226322]: 2026-01-26 10:11:39.752 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:41.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:41.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:43.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:43.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:45.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:45.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:46 np0005595445 nova_compute[226322]: 2026-01-26 10:11:44.695 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:46 np0005595445 nova_compute[226322]: 2026-01-26 10:11:44.753 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:47.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:47.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:47 np0005595445 ovn_controller[133670]: 2026-01-26T10:11:47Z|00047|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 05:11:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:49 np0005595445 nova_compute[226322]: 2026-01-26 10:11:49.199 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:11:49 np0005595445 podman[234389]: 2026-01-26 10:11:49.265512925 +0000 UTC m=+0.044424925 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 26 05:11:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:49.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:49 np0005595445 nova_compute[226322]: 2026-01-26 10:11:49.697 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:49 np0005595445 nova_compute[226322]: 2026-01-26 10:11:49.755 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:51.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:51.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:53.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:53.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:11:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:11:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:11:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:11:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:54 np0005595445 nova_compute[226322]: 2026-01-26 10:11:54.699 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:54 np0005595445 nova_compute[226322]: 2026-01-26 10:11:54.756 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:55.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:55.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:56 np0005595445 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 05:11:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:57.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:11:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:57.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:11:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:11:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:11:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:11:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:11:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:11:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:11:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:11:59.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:11:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:11:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:11:59.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:11:59 np0005595445 nova_compute[226322]: 2026-01-26 10:11:59.701 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:11:59 np0005595445 nova_compute[226322]: 2026-01-26 10:11:59.757 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:01.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:01.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:03.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:04 np0005595445 nova_compute[226322]: 2026-01-26 10:12:04.703 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:04 np0005595445 nova_compute[226322]: 2026-01-26 10:12:04.762 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:12:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:05.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:12:05 np0005595445 podman[234529]: 2026-01-26 10:12:05.354211985 +0000 UTC m=+0.127184297 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 05:12:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:12:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:05.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:12:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:07.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:07.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:09.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:09.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:12:09 np0005595445 nova_compute[226322]: 2026-01-26 10:12:09.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:09 np0005595445 nova_compute[226322]: 2026-01-26 10:12:09.765 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:11.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:11.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:13.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:13.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:13 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:14 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:14 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:12:14 np0005595445 nova_compute[226322]: 2026-01-26 10:12:14.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:14 np0005595445 nova_compute[226322]: 2026-01-26 10:12:14.766 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:15.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:15.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:17.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:18 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:19.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 26 05:12:19 np0005595445 nova_compute[226322]: 2026-01-26 10:12:19.736 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:19 np0005595445 nova_compute[226322]: 2026-01-26 10:12:19.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:20 np0005595445 podman[234587]: 2026-01-26 10:12:20.273889273 +0000 UTC m=+0.053848507 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 05:12:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:12:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:12:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:21 np0005595445 nova_compute[226322]: 2026-01-26 10:12:21.616 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:21 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:21.616 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:12:21 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:21.617 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:12:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:23.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:23 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:24 np0005595445 nova_compute[226322]: 2026-01-26 10:12:24.738 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:24 np0005595445 nova_compute[226322]: 2026-01-26 10:12:24.771 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:25.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.472 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.473 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.488 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.561 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.562 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.569 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.570 226326 INFO nova.compute.claims [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 26 05:12:26 np0005595445 nova_compute[226322]: 2026-01-26 10:12:26.721 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:12:27 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/981306923' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.267 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.274 226326 DEBUG nova.compute.provider_tree [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.288 226326 DEBUG nova.scheduler.client.report [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.314 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.315 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 05:12:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:12:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:27.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.368 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.368 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.390 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.417 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 05:12:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.554 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.555 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.555 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating image(s)#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.579 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.608 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.641 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.645 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.704 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.705 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.706 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.706 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.733 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:27 np0005595445 nova_compute[226322]: 2026-01-26 10:12:27.739 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 17d96116-83a4-40d0-9dcd-bd5072238621_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.045 226326 DEBUG nova.policy [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.111 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 17d96116-83a4-40d0-9dcd-bd5072238621_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.372s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.173 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.359 226326 DEBUG nova.objects.instance [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.388 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.388 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Ensure instance console log exists: /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.389 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:28 np0005595445 nova_compute[226322]: 2026-01-26 10:12:28.802 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:29.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:29 np0005595445 nova_compute[226322]: 2026-01-26 10:12:29.740 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:29 np0005595445 nova_compute[226322]: 2026-01-26 10:12:29.773 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:30 np0005595445 nova_compute[226322]: 2026-01-26 10:12:30.145 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Successfully created port: 93f5d287-34ec-424d-8776-df002083762e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 05:12:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:31.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:12:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:12:31 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:31.619 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:31 np0005595445 nova_compute[226322]: 2026-01-26 10:12:31.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.234 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Successfully updated port: 93f5d287-34ec-424d-8776-df002083762e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.250 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.251 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.251 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.438 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.459 226326 DEBUG nova.compute.manager [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-changed-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.460 226326 DEBUG nova.compute.manager [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Refreshing instance network info cache due to event network-changed-93f5d287-34ec-424d-8776-df002083762e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.460 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:32 np0005595445 nova_compute[226322]: 2026-01-26 10:12:32.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:33.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:33.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.595 226326 DEBUG nova.network.neutron [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.637 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.638 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance network_info: |[{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.638 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.639 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Refreshing network info cache for port 93f5d287-34ec-424d-8776-df002083762e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.641 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start _get_guest_xml network_info=[{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.646 226326 WARNING nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.661 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.662 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.667 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.668 226326 DEBUG nova.virt.libvirt.host [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.668 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.669 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.670 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.671 226326 DEBUG nova.virt.hardware [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.674 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.692 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.693 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.843 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.844 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.845 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:33 np0005595445 nova_compute[226322]: 2026-01-26 10:12:33.846 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:12:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:12:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2307260624' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.211 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.244 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.249 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:12:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3132034729' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.686 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.688 226326 DEBUG nova.virt.libvirt.vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:12:27Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.688 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.689 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.691 226326 DEBUG nova.objects.instance [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.726 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] End _get_guest_xml xml=<domain type="kvm">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <uuid>17d96116-83a4-40d0-9dcd-bd5072238621</uuid>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <name>instance-00000007</name>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <memory>131072</memory>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <vcpu>1</vcpu>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <metadata>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:name>tempest-TestNetworkBasicOps-server-200115615</nova:name>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:creationTime>2026-01-26 10:12:33</nova:creationTime>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:flavor name="m1.nano">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:memory>128</nova:memory>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:disk>1</nova:disk>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:swap>0</nova:swap>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:vcpus>1</nova:vcpus>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </nova:flavor>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:owner>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </nova:owner>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <nova:ports>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <nova:port uuid="93f5d287-34ec-424d-8776-df002083762e">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        </nova:port>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </nova:ports>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </nova:instance>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </metadata>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <sysinfo type="smbios">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <system>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="manufacturer">RDO</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="product">OpenStack Compute</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="serial">17d96116-83a4-40d0-9dcd-bd5072238621</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="uuid">17d96116-83a4-40d0-9dcd-bd5072238621</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <entry name="family">Virtual Machine</entry>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </system>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </sysinfo>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <os>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <boot dev="hd"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <smbios mode="sysinfo"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <acpi/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <apic/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <vmcoreinfo/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <clock offset="utc">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <timer name="hpet" present="no"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </clock>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <cpu mode="host-model" match="exact">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <disk type="network" device="disk">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/17d96116-83a4-40d0-9dcd-bd5072238621_disk">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <target dev="vda" bus="virtio"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <disk type="network" device="cdrom">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/17d96116-83a4-40d0-9dcd-bd5072238621_disk.config">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <target dev="sda" bus="sata"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <interface type="ethernet">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <mac address="fa:16:3e:00:c7:32"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <mtu size="1442"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <target dev="tap93f5d287-34"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <serial type="pty">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <log file="/var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/console.log" append="off"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </serial>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <video>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <input type="tablet" bus="usb"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <rng model="virtio">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <backend model="random">/dev/urandom</backend>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <controller type="usb" index="0"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    <memballoon model="virtio">
Jan 26 05:12:34 np0005595445 nova_compute[226322]:      <stats period="10"/>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:    </memballoon>
Jan 26 05:12:34 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:12:34 np0005595445 nova_compute[226322]: </domain>
Jan 26 05:12:34 np0005595445 nova_compute[226322]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.727 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Preparing to wait for external event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.727 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.728 226326 DEBUG nova.virt.libvirt.vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:12:27Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.729 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.729 226326 DEBUG nova.network.os_vif_util [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.730 226326 DEBUG os_vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.730 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.731 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.731 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93f5d287-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.735 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93f5d287-34, col_values=(('external_ids', {'iface-id': '93f5d287-34ec-424d-8776-df002083762e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:c7:32', 'vm-uuid': '17d96116-83a4-40d0-9dcd-bd5072238621'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:34 np0005595445 NetworkManager[49073]: <info>  [1769422354.7393] manager: (tap93f5d287-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.745 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.747 226326 INFO os_vif [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34')#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.774 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.823 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.823 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.824 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:00:c7:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.824 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Using config drive#033[00m
Jan 26 05:12:34 np0005595445 nova_compute[226322]: 2026-01-26 10:12:34.863 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:35.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:35 np0005595445 nova_compute[226322]: 2026-01-26 10:12:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:35 np0005595445 nova_compute[226322]: 2026-01-26 10:12:35.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:12:35 np0005595445 nova_compute[226322]: 2026-01-26 10:12:35.963 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updated VIF entry in instance network info cache for port 93f5d287-34ec-424d-8776-df002083762e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:12:35 np0005595445 nova_compute[226322]: 2026-01-26 10:12:35.963 226326 DEBUG nova.network.neutron [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [{"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:12:36 np0005595445 podman[234914]: 2026-01-26 10:12:36.332143102 +0000 UTC m=+0.101809412 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.344 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.344 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.345 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.373 226326 DEBUG oslo_concurrency.lockutils [req-c600f5a7-12f3-4785-b9bb-0f9ceac097d2 req-d2ea289b-91f1-4253-a9f7-4aa3ee7a8eed b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-17d96116-83a4-40d0-9dcd-bd5072238621" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:12:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:12:36 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3576426832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.812 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.868 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:12:36 np0005595445 nova_compute[226322]: 2026-01-26 10:12:36.868 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.013 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4826MB free_disk=59.921871185302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.014 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.108 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance 17d96116-83a4-40d0-9dcd-bd5072238621 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.109 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.109 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.127 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.149 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.150 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.166 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.184 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.216 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:37.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:12:37 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1035487032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.644 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.650 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.676 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.810 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:12:37 np0005595445 nova_compute[226322]: 2026-01-26 10:12:37.811 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.039 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Creating config drive at /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.045 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4941ruca execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.173 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4941ruca" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.204 226326 DEBUG nova.storage.rbd_utils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.211 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.461 226326 DEBUG oslo_concurrency.processutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config 17d96116-83a4-40d0-9dcd-bd5072238621_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.250s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.462 226326 INFO nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deleting local config drive /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621/disk.config because it was imported into RBD.#033[00m
Jan 26 05:12:38 np0005595445 systemd[1]: Starting libvirt secret daemon...
Jan 26 05:12:38 np0005595445 systemd[1]: Started libvirt secret daemon.
Jan 26 05:12:38 np0005595445 kernel: tap93f5d287-34: entered promiscuous mode
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.5543] manager: (tap93f5d287-34): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 05:12:38 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:38Z|00048|binding|INFO|Claiming lport 93f5d287-34ec-424d-8776-df002083762e for this chassis.
Jan 26 05:12:38 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:38Z|00049|binding|INFO|93f5d287-34ec-424d-8776-df002083762e: Claiming fa:16:3e:00:c7:32 10.100.0.27
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.566 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:c7:32 10.100.0.27'], port_security=['fa:16:3e:00:c7:32 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '17d96116-83a4-40d0-9dcd-bd5072238621', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b040f826-b5f4-4627-aa36-62cc16e0727f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=73bcc0f9-41ce-47a1-86a1-53fe1b73bb31, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=93f5d287-34ec-424d-8776-df002083762e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.567 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 93f5d287-34ec-424d-8776-df002083762e in datapath ae1cb66c-0987-4156-9bdb-cb2a08957306 bound to our chassis#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.569 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae1cb66c-0987-4156-9bdb-cb2a08957306#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.579 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[9761b12b-3f43-49ca-8e22-96f5049eda59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.579 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae1cb66c-01 in ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.581 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae1cb66c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.581 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[94d46c7d-9afb-4ce8-a066-d387fe0d1d10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.583 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[126a6126-52d3-444d-893e-6176b0ec86b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 systemd-udevd[235059]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.594 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ca7631-4f9f-48cf-b9c3-0b42ac58ed78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 systemd-machined[194876]: New machine qemu-3-instance-00000007.
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.598 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.6048] device (tap93f5d287-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.6059] device (tap93f5d287-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.605 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:38Z|00050|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e ovn-installed in OVS
Jan 26 05:12:38 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:38Z|00051|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e up in Southbound
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.607 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.618 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d68a26a7-afe2-4dde-b27e-1efeb027b076]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.646 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[4a661095-829e-4f7a-a19a-8738785eee4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.652 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0f12ea25-d8e3-48b7-8e16-456ca9b59e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.6541] manager: (tapae1cb66c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.678 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[48bf061d-52a7-4446-83e2-c1df9248b330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.681 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[d51a5050-3321-4852-b87c-82754a6e424e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.6992] device (tapae1cb66c-00): carrier: link connected
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.703 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[8e845334-24ee-415f-b949-9f1084d8fde5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.720 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[633f0268-fcda-4fea-a44f-0dc1586e74ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae1cb66c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:97:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431926, 'reachable_time': 27698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235090, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.732 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e192661e-311f-4911-a6a9-1f12bd428398]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:9770'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431926, 'tstamp': 431926}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235091, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.745 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9fc857-dfcb-4c45-b1ec-49b792daa337]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae1cb66c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:97:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431926, 'reachable_time': 27698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235092, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.770 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[ece01b40-e012-4877-a327-ae8e5503bb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.831 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[71459863-83c7-4544-bff5-dd6e0904fa0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.832 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae1cb66c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.833 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.833 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae1cb66c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.835 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 NetworkManager[49073]: <info>  [1769422358.8364] manager: (tapae1cb66c-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 05:12:38 np0005595445 kernel: tapae1cb66c-00: entered promiscuous mode
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.839 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.840 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae1cb66c-00, col_values=(('external_ids', {'iface-id': 'eff5217a-1c96-40b0-bc7b-e1d3937349a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:12:38 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:38Z|00052|binding|INFO|Releasing lport eff5217a-1c96-40b0-bc7b-e1d3937349a6 from this chassis (sb_readonly=0)
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 nova_compute[226322]: 2026-01-26 10:12:38.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.869 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.870 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7f018f-0675-46de-9496-350b0537ec72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.871 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: global
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    log         /dev/log local0 debug
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    log-tag     haproxy-metadata-proxy-ae1cb66c-0987-4156-9bdb-cb2a08957306
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    user        root
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    group       root
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    maxconn     1024
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    pidfile     /var/lib/neutron/external/pids/ae1cb66c-0987-4156-9bdb-cb2a08957306.pid.haproxy
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    daemon
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: defaults
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    log global
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    mode http
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    option httplog
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    option dontlognull
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    option http-server-close
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    option forwardfor
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    retries                 3
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    timeout http-request    30s
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    timeout connect         30s
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    timeout client          32s
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    timeout server          32s
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    timeout http-keep-alive 30s
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: listen listener
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    bind 169.254.169.254:80
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]:    http-request add-header X-OVN-Network-ID ae1cb66c-0987-4156-9bdb-cb2a08957306
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 05:12:38 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:38.871 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'env', 'PROCESS_TAG=haproxy-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae1cb66c-0987-4156-9bdb-cb2a08957306.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 05:12:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:39 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.202 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422359.202343, 17d96116-83a4-40d0-9dcd-bd5072238621 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.203 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Started (Lifecycle Event)#033[00m
Jan 26 05:12:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:39.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:39 np0005595445 podman[235166]: 2026-01-26 10:12:39.279586989 +0000 UTC m=+0.019813616 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.383 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.388 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422359.2032652, 17d96116-83a4-40d0-9dcd-bd5072238621 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.388 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Paused (Lifecycle Event)#033[00m
Jan 26 05:12:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:39.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:39 np0005595445 nova_compute[226322]: 2026-01-26 10:12:39.777 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:41.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:41.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:41 np0005595445 podman[235166]: 2026-01-26 10:12:41.4825926 +0000 UTC m=+2.222819207 container create 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 26 05:12:42 np0005595445 systemd[1]: Started libpod-conmon-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope.
Jan 26 05:12:42 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:12:42 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4db7298031f92053143f67de5fae7608006766babe131c36a52e198214e577e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 05:12:42 np0005595445 podman[235166]: 2026-01-26 10:12:42.416235638 +0000 UTC m=+3.156462265 container init 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:12:42 np0005595445 podman[235166]: 2026-01-26 10:12:42.422202498 +0000 UTC m=+3.162429125 container start 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 26 05:12:42 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : New worker (235189) forked
Jan 26 05:12:42 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : Loading success.
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.471 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.474 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.537 226326 DEBUG nova.compute.manager [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.538 226326 DEBUG oslo_concurrency.lockutils [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.539 226326 DEBUG nova.compute.manager [req-17b4c832-3f09-492e-b91e-bf50a5c874b4 req-6c2353ff-70ba-4795-a673-cc8d6f07bbe0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Processing event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.539 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.550 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.553 226326 INFO nova.virt.libvirt.driver [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance spawned successfully.#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.554 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.575 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.576 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422362.5438752, 17d96116-83a4-40d0-9dcd-bd5072238621 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.576 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Resumed (Lifecycle Event)#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.643 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.648 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.649 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.650 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.650 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.651 226326 DEBUG nova.virt.libvirt.driver [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.655 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.659 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.696 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.770 226326 INFO nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 15.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 05:12:42 np0005595445 nova_compute[226322]: 2026-01-26 10:12:42.771 226326 DEBUG nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:12:43 np0005595445 nova_compute[226322]: 2026-01-26 10:12:43.063 226326 INFO nova.compute.manager [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 16.53 seconds to build instance.#033[00m
Jan 26 05:12:43 np0005595445 nova_compute[226322]: 2026-01-26 10:12:43.185 226326 DEBUG oslo_concurrency.lockutils [None req-d580db12-d168-4035-87d3-6c0756bcbf0c c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:43.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:43.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:44 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:44 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.642 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 DEBUG oslo_concurrency.lockutils [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 DEBUG nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.643 226326 WARNING nova.compute.manager [req-66f1babd-ff8a-4527-8bf2-b6d519e66785 req-9a1394a4-7c30-4679-976a-bc00194b2799 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received unexpected event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e for instance with vm_state active and task_state None.#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.739 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:44 np0005595445 nova_compute[226322]: 2026-01-26 10:12:44.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:45.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:45.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:47.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:47.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:49 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:49 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:12:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:49.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:12:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:49.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:49 np0005595445 nova_compute[226322]: 2026-01-26 10:12:49.741 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:49 np0005595445 nova_compute[226322]: 2026-01-26 10:12:49.780 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [WARNING] 025/101250 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 26 05:12:50 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf[86629]: [ALERT] 025/101250 (4) : backend 'backend' has no server available!
Jan 26 05:12:51 np0005595445 podman[235229]: 2026-01-26 10:12:51.335303744 +0000 UTC m=+0.106163466 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:12:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:51.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:51.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:53.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.936 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:12:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:12:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:12:53.939 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:12:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:54 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:54 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:12:54 np0005595445 nova_compute[226322]: 2026-01-26 10:12:54.743 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:54 np0005595445 nova_compute[226322]: 2026-01-26 10:12:54.782 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:55.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:55.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:56 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:56 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:12:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:57.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:12:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:57.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:12:57 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:57Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:c7:32 10.100.0.27
Jan 26 05:12:57 np0005595445 ovn_controller[133670]: 2026-01-26T10:12:57Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:c7:32 10.100.0.27
Jan 26 05:12:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:12:59.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:12:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:12:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:12:59.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:12:59 np0005595445 nova_compute[226322]: 2026-01-26 10:12:59.745 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:59 np0005595445 nova_compute[226322]: 2026-01-26 10:12:59.783 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:12:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:12:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:12:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:12:59 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:12:59 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:01.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:13:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:01.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:13:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:03.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:03.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:04 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:04 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:04 np0005595445 nova_compute[226322]: 2026-01-26 10:13:04.748 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:04 np0005595445 nova_compute[226322]: 2026-01-26 10:13:04.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:13:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:05.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:13:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:05.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:07 np0005595445 podman[235286]: 2026-01-26 10:13:07.291163167 +0000 UTC m=+0.090285314 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 05:13:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:07.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:07.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.642287) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388642312, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1703, "num_deletes": 255, "total_data_size": 4470135, "memory_usage": 4524432, "flush_reason": "Manual Compaction"}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388667867, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2876864, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28372, "largest_seqno": 30070, "table_properties": {"data_size": 2869763, "index_size": 4108, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14635, "raw_average_key_size": 19, "raw_value_size": 2855521, "raw_average_value_size": 3807, "num_data_blocks": 180, "num_entries": 750, "num_filter_entries": 750, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422248, "oldest_key_time": 1769422248, "file_creation_time": 1769422388, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 25620 microseconds, and 6958 cpu microseconds.
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.667903) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2876864 bytes OK
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.667926) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669307) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669319) EVENT_LOG_v1 {"time_micros": 1769422388669315, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.669581) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4462339, prev total WAL file size 4462339, number of live WAL files 2.
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.670863) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373532' seq:0, type:0; will stop at (end)
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2809KB)], [54(13MB)]
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388670983, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17431016, "oldest_snapshot_seqno": -1}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6045 keys, 17284848 bytes, temperature: kUnknown
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388814073, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17284848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17240868, "index_size": 27742, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 153717, "raw_average_key_size": 25, "raw_value_size": 17128288, "raw_average_value_size": 2833, "num_data_blocks": 1138, "num_entries": 6045, "num_filter_entries": 6045, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422388, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.814410) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17284848 bytes
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.816253) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.7 rd, 120.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 13.9 +0.0 blob) out(16.5 +0.0 blob), read-write-amplify(12.1) write-amplify(6.0) OK, records in: 6573, records dropped: 528 output_compression: NoCompression
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.816278) EVENT_LOG_v1 {"time_micros": 1769422388816265, "job": 32, "event": "compaction_finished", "compaction_time_micros": 143242, "compaction_time_cpu_micros": 60715, "output_level": 6, "num_output_files": 1, "total_output_size": 17284848, "num_input_records": 6573, "num_output_records": 6045, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388816888, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422388819421, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.670731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:13:08.819552) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:13:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:09 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:09 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:09.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:09 np0005595445 nova_compute[226322]: 2026-01-26 10:13:09.750 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:09 np0005595445 nova_compute[226322]: 2026-01-26 10:13:09.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.058 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.058 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.059 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.061 226326 INFO nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Terminating instance#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.062 226326 DEBUG nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 05:13:11 np0005595445 kernel: tap93f5d287-34 (unregistering): left promiscuous mode
Jan 26 05:13:11 np0005595445 NetworkManager[49073]: <info>  [1769422391.1203] device (tap93f5d287-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.129 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 ovn_controller[133670]: 2026-01-26T10:13:11Z|00053|binding|INFO|Releasing lport 93f5d287-34ec-424d-8776-df002083762e from this chassis (sb_readonly=0)
Jan 26 05:13:11 np0005595445 ovn_controller[133670]: 2026-01-26T10:13:11Z|00054|binding|INFO|Setting lport 93f5d287-34ec-424d-8776-df002083762e down in Southbound
Jan 26 05:13:11 np0005595445 ovn_controller[133670]: 2026-01-26T10:13:11Z|00055|binding|INFO|Removing iface tap93f5d287-34 ovn-installed in OVS
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.133 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.137 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:c7:32 10.100.0.27'], port_security=['fa:16:3e:00:c7:32 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '17d96116-83a4-40d0-9dcd-bd5072238621', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b040f826-b5f4-4627-aa36-62cc16e0727f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=73bcc0f9-41ce-47a1-86a1-53fe1b73bb31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=93f5d287-34ec-424d-8776-df002083762e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.138 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 93f5d287-34ec-424d-8776-df002083762e in datapath ae1cb66c-0987-4156-9bdb-cb2a08957306 unbound from our chassis#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.139 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae1cb66c-0987-4156-9bdb-cb2a08957306, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.142 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[f77db2ff-4906-4e3c-a106-1186c31bab7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.143 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 namespace which is not needed anymore#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.150 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 26 05:13:11 np0005595445 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 14.372s CPU time.
Jan 26 05:13:11 np0005595445 systemd-machined[194876]: Machine qemu-3-instance-00000007 terminated.
Jan 26 05:13:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.299 226326 INFO nova.virt.libvirt.driver [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Instance destroyed successfully.#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.299 226326 DEBUG nova.objects.instance [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid 17d96116-83a4-40d0-9dcd-bd5072238621 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:13:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:11.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:11 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : haproxy version is 2.8.14-c23fe91
Jan 26 05:13:11 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [NOTICE]   (235187) : path to executable is /usr/sbin/haproxy
Jan 26 05:13:11 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [WARNING]  (235187) : Exiting Master process...
Jan 26 05:13:11 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [ALERT]    (235187) : Current worker (235189) exited with code 143 (Terminated)
Jan 26 05:13:11 np0005595445 neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306[235183]: [WARNING]  (235187) : All workers exited. Exiting... (0)
Jan 26 05:13:11 np0005595445 systemd[1]: libpod-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope: Deactivated successfully.
Jan 26 05:13:11 np0005595445 podman[235344]: 2026-01-26 10:13:11.651549724 +0000 UTC m=+0.406588575 container died 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.693 226326 DEBUG nova.virt.libvirt.vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:12:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-200115615',display_name='tempest-TestNetworkBasicOps-server-200115615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-200115615',id=7,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNdcGSrtCXDvk89I0Vz9B7It1mSwlDvkZdMdWLCvoVbAe00VgK1axvS1LkIm+2Wq3uLZqcGzySpB1p+E5CAX8FNwfd40H16DcgIFt/DJC5r2xLViVsjIsqcjjLML0XecLw==',key_name='tempest-TestNetworkBasicOps-1792120849',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:12:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-0x8cj090',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:12:42Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=17d96116-83a4-40d0-9dcd-bd5072238621,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.693 226326 DEBUG nova.network.os_vif_util [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "93f5d287-34ec-424d-8776-df002083762e", "address": "fa:16:3e:00:c7:32", "network": {"id": "ae1cb66c-0987-4156-9bdb-cb2a08957306", "bridge": "br-int", "label": "tempest-network-smoke--514366077", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93f5d287-34", "ovs_interfaceid": "93f5d287-34ec-424d-8776-df002083762e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.695 226326 DEBUG nova.network.os_vif_util [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.696 226326 DEBUG os_vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.703 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93f5d287-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.707 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.709 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.713 226326 INFO os_vif [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:c7:32,bridge_name='br-int',has_traffic_filtering=True,id=93f5d287-34ec-424d-8776-df002083762e,network=Network(ae1cb66c-0987-4156-9bdb-cb2a08957306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93f5d287-34')#033[00m
Jan 26 05:13:11 np0005595445 systemd[1]: var-lib-containers-storage-overlay-4db7298031f92053143f67de5fae7608006766babe131c36a52e198214e577e1-merged.mount: Deactivated successfully.
Jan 26 05:13:11 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f-userdata-shm.mount: Deactivated successfully.
Jan 26 05:13:11 np0005595445 podman[235344]: 2026-01-26 10:13:11.72388907 +0000 UTC m=+0.478927901 container cleanup 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 05:13:11 np0005595445 systemd[1]: libpod-conmon-4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f.scope: Deactivated successfully.
Jan 26 05:13:11 np0005595445 podman[235402]: 2026-01-26 10:13:11.903967471 +0000 UTC m=+0.152109277 container remove 4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.910 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[28f549a5-4cd2-4576-937e-8a60af3123a8]: (4, ('Mon Jan 26 10:13:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 (4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f)\n4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f\nMon Jan 26 10:13:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 (4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f)\n4e473c1ad3d3e070d32daaf08f83443353c7ee6bcc28b44c496d650a78b78a6f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.912 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[9b705132-3ae3-4b34-81de-5a625ae13bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.914 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae1cb66c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.916 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 kernel: tapae1cb66c-00: left promiscuous mode
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.924 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[662775cd-bf9f-4a26-9e37-f396aa217a87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.927 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.928 226326 DEBUG oslo_concurrency.lockutils [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.929 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.929 226326 DEBUG nova.compute.manager [req-ff4c01a0-7ff1-4dc8-bf34-9c642c122574 req-e19dc28a-b541-44fc-863a-94c282fb2c53 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-unplugged-93f5d287-34ec-424d-8776-df002083762e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 05:13:11 np0005595445 nova_compute[226322]: 2026-01-26 10:13:11.934 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.944 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[6e748751-705d-4735-a605-c76f235210f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.946 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a1a263-18fb-4602-895b-77ea0eb182be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.966 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c71d3fc2-4fc8-44c9-835e-cba64ac57562]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431920, 'reachable_time': 20765, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235420, 'error': None, 'target': 'ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.971 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae1cb66c-0987-4156-9bdb-cb2a08957306 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 05:13:11 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:11.971 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[3512caa1-81cb-4598-9d42-76405e84b099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:13:11 np0005595445 systemd[1]: run-netns-ovnmeta\x2dae1cb66c\x2d0987\x2d4156\x2d9bdb\x2dcb2a08957306.mount: Deactivated successfully.
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.471 226326 INFO nova.virt.libvirt.driver [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deleting instance files /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621_del#033[00m
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.472 226326 INFO nova.virt.libvirt.driver [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deletion of /var/lib/nova/instances/17d96116-83a4-40d0-9dcd-bd5072238621_del complete#033[00m
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.531 226326 INFO nova.compute.manager [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 1.47 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.532 226326 DEBUG oslo.service.loopingcall [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.533 226326 DEBUG nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 05:13:12 np0005595445 nova_compute[226322]: 2026-01-26 10:13:12.533 226326 DEBUG nova.network.neutron [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 05:13:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:13.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.474 226326 DEBUG nova.network.neutron [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.501 226326 INFO nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Took 0.97 seconds to deallocate network for instance.#033[00m
Jan 26 05:13:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:13.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.546 226326 DEBUG nova.compute.manager [req-ce3bdba2-53ca-4c2c-b78f-868aa9aa2dd9 req-3dd56532-8664-45ae-a04a-7e062b24d5be b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-deleted-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.565 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.565 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.615 226326 DEBUG oslo_concurrency.processutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.994 226326 DEBUG nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.995 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.995 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 DEBUG oslo_concurrency.lockutils [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 DEBUG nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] No waiting events found dispatching network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:13:13 np0005595445 nova_compute[226322]: 2026-01-26 10:13:13.996 226326 WARNING nova.compute.manager [req-ea02cfb7-930f-4e15-a5c3-327bbdd4e57c req-090ec7a2-64ab-4820-bda9-0bd24f87bc57 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Received unexpected event network-vif-plugged-93f5d287-34ec-424d-8776-df002083762e for instance with vm_state deleted and task_state None.#033[00m
Jan 26 05:13:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:14 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:14 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:14 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:13:14 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1814321471' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.061 226326 DEBUG oslo_concurrency.processutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.069 226326 DEBUG nova.compute.provider_tree [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.083 226326 DEBUG nova.scheduler.client.report [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.109 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.137 226326 INFO nova.scheduler.client.report [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance 17d96116-83a4-40d0-9dcd-bd5072238621#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.196 226326 DEBUG oslo_concurrency.lockutils [None req-ad4e2b86-dc04-4993-be0f-6bd51e70ebf6 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "17d96116-83a4-40d0-9dcd-bd5072238621" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:14 np0005595445 nova_compute[226322]: 2026-01-26 10:13:14.793 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:13:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:13:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:13:15 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:13:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:15.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:15.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:16 np0005595445 nova_compute[226322]: 2026-01-26 10:13:16.709 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:17.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:18 np0005595445 nova_compute[226322]: 2026-01-26 10:13:18.333 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:19 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:19 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:19.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:19 np0005595445 nova_compute[226322]: 2026-01-26 10:13:19.794 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:13:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:13:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:21.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:21 np0005595445 nova_compute[226322]: 2026-01-26 10:13:21.713 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:22 np0005595445 podman[235554]: 2026-01-26 10:13:22.312546397 +0000 UTC m=+0.081670936 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 05:13:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:23.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:23 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.878 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:13:23 np0005595445 nova_compute[226322]: 2026-01-26 10:13:23.879 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:23 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.880 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:13:23 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:23.882 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:13:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:24 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:24 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:24 np0005595445 nova_compute[226322]: 2026-01-26 10:13:24.797 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:25.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:25.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:26 np0005595445 nova_compute[226322]: 2026-01-26 10:13:26.298 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422391.2958548, 17d96116-83a4-40d0-9dcd-bd5072238621 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:13:26 np0005595445 nova_compute[226322]: 2026-01-26 10:13:26.298 226326 INFO nova.compute.manager [-] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] VM Stopped (Lifecycle Event)#033[00m
Jan 26 05:13:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:26 np0005595445 nova_compute[226322]: 2026-01-26 10:13:26.637 226326 DEBUG nova.compute.manager [None req-c3236d7f-3751-4c0d-8edd-1d434244810c - - - - - -] [instance: 17d96116-83a4-40d0-9dcd-bd5072238621] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:13:26 np0005595445 nova_compute[226322]: 2026-01-26 10:13:26.716 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:27.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:27.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:29 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:29 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:29.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:29.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:29 np0005595445 nova_compute[226322]: 2026-01-26 10:13:29.799 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:31.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:31 np0005595445 nova_compute[226322]: 2026-01-26 10:13:31.719 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:31 np0005595445 nova_compute[226322]: 2026-01-26 10:13:31.811 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:33.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:33.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:34 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:34 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.436 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.437 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.437 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.688 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.730 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.730 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:34 np0005595445 nova_compute[226322]: 2026-01-26 10:13:34.801 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:35.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:35.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:35 np0005595445 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:35 np0005595445 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:35 np0005595445 nova_compute[226322]: 2026-01-26 10:13:35.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:13:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:36 np0005595445 nova_compute[226322]: 2026-01-26 10:13:36.722 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:37.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:37.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.788 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:13:37 np0005595445 nova_compute[226322]: 2026-01-26 10:13:37.789 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:13:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:13:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2944046988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.207 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:13:38 np0005595445 podman[235630]: 2026-01-26 10:13:38.307859092 +0000 UTC m=+0.081405539 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.368 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4901MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.369 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.507 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.508 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:13:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:13:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/486237949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.989 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:13:38 np0005595445 nova_compute[226322]: 2026-01-26 10:13:38.994 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:13:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:39 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:39 np0005595445 nova_compute[226322]: 2026-01-26 10:13:39.219 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:13:39 np0005595445 nova_compute[226322]: 2026-01-26 10:13:39.376 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:13:39 np0005595445 nova_compute[226322]: 2026-01-26 10:13:39.377 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:39.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:39.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:39 np0005595445 nova_compute[226322]: 2026-01-26 10:13:39.803 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:41.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:41.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:41 np0005595445 nova_compute[226322]: 2026-01-26 10:13:41.726 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:43.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:43.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:44 np0005595445 nova_compute[226322]: 2026-01-26 10:13:44.805 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:45.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:45.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:46 np0005595445 nova_compute[226322]: 2026-01-26 10:13:46.766 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:47.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:47.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:49.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:49.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:49 np0005595445 nova_compute[226322]: 2026-01-26 10:13:49.808 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:51.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:51.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:51 np0005595445 nova_compute[226322]: 2026-01-26 10:13:51.769 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:53 np0005595445 podman[235713]: 2026-01-26 10:13:53.292081697 +0000 UTC m=+0.068792913 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 26 05:13:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:53.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:13:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:13:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:13:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:13:54 np0005595445 nova_compute[226322]: 2026-01-26 10:13:54.810 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:55.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:13:56 np0005595445 nova_compute[226322]: 2026-01-26 10:13:56.772 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:13:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:13:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:13:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:13:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:57.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:13:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:13:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:13:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:13:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:13:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:13:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.003000089s ======
Jan 26 05:13:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:13:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000089s
Jan 26 05:13:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:13:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:13:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:13:59.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:13:59 np0005595445 nova_compute[226322]: 2026-01-26 10:13:59.812 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:01.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:01.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:01 np0005595445 nova_compute[226322]: 2026-01-26 10:14:01.775 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:03.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:03.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.424 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.425 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.446 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.539 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.539 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.546 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.546 226326 INFO nova.compute.claims [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.693 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:04 np0005595445 nova_compute[226322]: 2026-01-26 10:14:04.814 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:14:05 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/12435975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.177 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.182 226326 DEBUG nova.compute.provider_tree [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.201 226326 DEBUG nova.scheduler.client.report [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.232 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.233 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.287 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.288 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.306 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.325 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.412 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.414 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.414 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating image(s)#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.439 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.467 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:14:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:05.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.491 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.494 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.545 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.571 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.574 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 51ec8779-f667-4f68-853c-545679d761b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:14:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:05.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.822 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c 51ec8779-f667-4f68-853c-545679d761b9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.890 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 05:14:05 np0005595445 nova_compute[226322]: 2026-01-26 10:14:05.996 226326 DEBUG nova.objects.instance [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.010 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.010 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Ensure instance console log exists: /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.011 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.222 226326 DEBUG nova.policy [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 05:14:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:06 np0005595445 nova_compute[226322]: 2026-01-26 10:14:06.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:07.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:07.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.365 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Successfully updated port: 386a7730-6a16-4b18-b368-561762a8f7af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.453 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.454 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.454 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.459 226326 DEBUG nova.compute.manager [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-changed-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.460 226326 DEBUG nova.compute.manager [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing instance network info cache due to event network-changed-386a7730-6a16-4b18-b368-561762a8f7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:14:08 np0005595445 nova_compute[226322]: 2026-01-26 10:14:08.460 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:14:09 np0005595445 nova_compute[226322]: 2026-01-26 10:14:09.106 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 05:14:09 np0005595445 podman[235956]: 2026-01-26 10:14:09.31017897 +0000 UTC m=+0.090285013 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 05:14:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:09.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:14:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:09.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:14:09 np0005595445 nova_compute[226322]: 2026-01-26 10:14:09.815 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.495 226326 DEBUG nova.network.neutron [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.514 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.514 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance network_info: |[{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.515 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.515 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.517 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start _get_guest_xml network_info=[{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.521 226326 WARNING nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.526 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.526 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.531 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.531 226326 DEBUG nova.virt.libvirt.host [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.532 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.533 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.534 226326 DEBUG nova.virt.hardware [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 05:14:10 np0005595445 nova_compute[226322]: 2026-01-26 10:14:10.536 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:10 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:14:10 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/49620544' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.013 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.035 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.039 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:14:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:14:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:14:11 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3883266146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.526 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.529 226326 DEBUG nova.virt.libvirt.vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:14:05Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.529 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.530 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.531 226326 DEBUG nova.objects.instance [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.544 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] End _get_guest_xml xml=<domain type="kvm">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <uuid>51ec8779-f667-4f68-853c-545679d761b9</uuid>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <name>instance-00000008</name>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <memory>131072</memory>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <vcpu>1</vcpu>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <metadata>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:name>tempest-TestNetworkBasicOps-server-1744381875</nova:name>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:creationTime>2026-01-26 10:14:10</nova:creationTime>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:flavor name="m1.nano">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:memory>128</nova:memory>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:disk>1</nova:disk>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:swap>0</nova:swap>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:vcpus>1</nova:vcpus>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </nova:flavor>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:owner>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </nova:owner>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <nova:ports>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <nova:port uuid="386a7730-6a16-4b18-b368-561762a8f7af">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        </nova:port>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </nova:ports>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </nova:instance>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </metadata>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <sysinfo type="smbios">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <system>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="manufacturer">RDO</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="product">OpenStack Compute</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="serial">51ec8779-f667-4f68-853c-545679d761b9</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="uuid">51ec8779-f667-4f68-853c-545679d761b9</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <entry name="family">Virtual Machine</entry>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </system>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </sysinfo>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <os>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <boot dev="hd"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <smbios mode="sysinfo"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <acpi/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <apic/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <vmcoreinfo/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <clock offset="utc">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <timer name="hpet" present="no"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </clock>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <cpu mode="host-model" match="exact">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <disk type="network" device="disk">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/51ec8779-f667-4f68-853c-545679d761b9_disk">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <target dev="vda" bus="virtio"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <disk type="network" device="cdrom">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/51ec8779-f667-4f68-853c-545679d761b9_disk.config">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <target dev="sda" bus="sata"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <interface type="ethernet">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <mac address="fa:16:3e:d6:e4:a1"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <mtu size="1442"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <target dev="tap386a7730-6a"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <serial type="pty">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <log file="/var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/console.log" append="off"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </serial>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <video>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <input type="tablet" bus="usb"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <rng model="virtio">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <backend model="random">/dev/urandom</backend>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <controller type="usb" index="0"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    <memballoon model="virtio">
Jan 26 05:14:11 np0005595445 nova_compute[226322]:      <stats period="10"/>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:    </memballoon>
Jan 26 05:14:11 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:14:11 np0005595445 nova_compute[226322]: </domain>
Jan 26 05:14:11 np0005595445 nova_compute[226322]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Preparing to wait for external event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.546 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.547 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.547 226326 DEBUG nova.virt.libvirt.vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:14:05Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.548 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.548 226326 DEBUG nova.network.os_vif_util [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.549 226326 DEBUG os_vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.550 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.551 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.554 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap386a7730-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.555 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap386a7730-6a, col_values=(('external_ids', {'iface-id': '386a7730-6a16-4b18-b368-561762a8f7af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:e4:a1', 'vm-uuid': '51ec8779-f667-4f68-853c-545679d761b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:11 np0005595445 NetworkManager[49073]: <info>  [1769422451.5593] manager: (tap386a7730-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.566 226326 INFO os_vif [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a')#033[00m
Jan 26 05:14:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:14:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.608 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:d6:e4:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.609 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Using config drive#033[00m
Jan 26 05:14:11 np0005595445 nova_compute[226322]: 2026-01-26 10:14:11.633 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.166 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updated VIF entry in instance network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.167 226326 DEBUG nova.network.neutron [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.188 226326 DEBUG oslo_concurrency.lockutils [req-eb7a636b-e5ae-49f8-bb40-ddfaa0576843 req-d58107c8-25f1-4e59-a671-4579ab4d606d b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.317 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Creating config drive at /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.321 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7ylmt3m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.446 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg7ylmt3m" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.478 226326 DEBUG nova.storage.rbd_utils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image 51ec8779-f667-4f68-853c-545679d761b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.482 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config 51ec8779-f667-4f68-853c-545679d761b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.664 226326 DEBUG oslo_concurrency.processutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config 51ec8779-f667-4f68-853c-545679d761b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.665 226326 INFO nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deleting local config drive /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9/disk.config because it was imported into RBD.#033[00m
Jan 26 05:14:12 np0005595445 kernel: tap386a7730-6a: entered promiscuous mode
Jan 26 05:14:12 np0005595445 NetworkManager[49073]: <info>  [1769422452.7146] manager: (tap386a7730-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 05:14:12 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:12Z|00056|binding|INFO|Claiming lport 386a7730-6a16-4b18-b368-561762a8f7af for this chassis.
Jan 26 05:14:12 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:12Z|00057|binding|INFO|386a7730-6a16-4b18-b368-561762a8f7af: Claiming fa:16:3e:d6:e4:a1 10.100.0.10
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.715 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.718 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.720 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.727 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:e4:a1 10.100.0.10'], port_security=['fa:16:3e:d6:e4:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '51ec8779-f667-4f68-853c-545679d761b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '75a6a4cb-bd58-457c-b449-9db5f70f3f78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01b44e6c-3a91-48f0-92f1-3334bccbc3c9, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=386a7730-6a16-4b18-b368-561762a8f7af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.729 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 386a7730-6a16-4b18-b368-561762a8f7af in datapath f91dcb4b-184c-45d6-a0e9-285bb6bc3464 bound to our chassis#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.730 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f91dcb4b-184c-45d6-a0e9-285bb6bc3464#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.740 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[986479cd-6a1a-4728-9e52-42b449f27485]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.740 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf91dcb4b-11 in ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 05:14:12 np0005595445 systemd-machined[194876]: New machine qemu-4-instance-00000008.
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf91dcb4b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd5c6d3-07a3-45ca-83f5-49c9d0ec9696]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.742 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cb344e-3d45-4fdc-afc0-8020c510fd94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 systemd-udevd[236120]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.753 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd6b589-adc0-42f9-93ab-9a44e448ee4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 NetworkManager[49073]: <info>  [1769422452.7584] device (tap386a7730-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 05:14:12 np0005595445 NetworkManager[49073]: <info>  [1769422452.7591] device (tap386a7730-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 05:14:12 np0005595445 systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.775 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[98d5f69a-18b1-444c-a415-c742b6064c96]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:12 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:12Z|00058|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af ovn-installed in OVS
Jan 26 05:14:12 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:12Z|00059|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af up in Southbound
Jan 26 05:14:12 np0005595445 nova_compute[226322]: 2026-01-26 10:14:12.784 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.803 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[b922dd51-b7ae-41be-a174-7deb07e1ce0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 NetworkManager[49073]: <info>  [1769422452.8094] manager: (tapf91dcb4b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.809 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4e65f143-6056-4a54-b3ed-5a2e8bf1f4a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.837 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc33c62-799e-4570-af6c-06549661905c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.839 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1677a7-01de-4581-8ab3-062d57d13b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 NetworkManager[49073]: <info>  [1769422452.8595] device (tapf91dcb4b-10): carrier: link connected
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.864 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9658aa09-091c-4e67-9e48-6be98f1744ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.883 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfa9bcf-e1d0-4bce-85f5-91104a66f2d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf91dcb4b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:0e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441342, 'reachable_time': 28311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236152, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.897 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c416f8bc-f822-4464-84f6-7fb6b5df901a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:efb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441342, 'tstamp': 441342}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236153, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.916 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[7db23196-494d-4b7d-8984-3172b3f1f746]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf91dcb4b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:0e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441342, 'reachable_time': 28311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236154, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:12.946 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0edc154b-6d8b-429e-98a2-6fdc47cd1915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.002 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa341f8-029f-4ed9-b6ea-68abbd8991a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.003 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf91dcb4b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.003 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.004 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf91dcb4b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:13 np0005595445 kernel: tapf91dcb4b-10: entered promiscuous mode
Jan 26 05:14:13 np0005595445 NetworkManager[49073]: <info>  [1769422453.0062] manager: (tapf91dcb4b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.007 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.008 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf91dcb4b-10, col_values=(('external_ids', {'iface-id': '242dde27-5aff-4cac-b664-221ab4bfb94f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.009 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:13 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:13Z|00060|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.010 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.010 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.011 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[fd31ca85-575d-46ca-9904-de978ac4d435]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.012 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: global
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    log         /dev/log local0 debug
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    log-tag     haproxy-metadata-proxy-f91dcb4b-184c-45d6-a0e9-285bb6bc3464
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    user        root
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    group       root
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    maxconn     1024
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    pidfile     /var/lib/neutron/external/pids/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.pid.haproxy
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    daemon
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: defaults
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    log global
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    mode http
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    option httplog
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    option dontlognull
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    option http-server-close
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    option forwardfor
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    retries                 3
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    timeout http-request    30s
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    timeout connect         30s
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    timeout client          32s
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    timeout server          32s
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    timeout http-keep-alive 30s
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: listen listener
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    bind 169.254.169.254:80
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]:    http-request add-header X-OVN-Network-ID f91dcb4b-184c-45d6-a0e9-285bb6bc3464
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 05:14:13 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:13.013 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'env', 'PROCESS_TAG=haproxy-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f91dcb4b-184c-45d6-a0e9-285bb6bc3464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.022 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.257 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.257486, 51ec8779-f667-4f68-853c-545679d761b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.258 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Started (Lifecycle Event)#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.276 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.280 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.2576222, 51ec8779-f667-4f68-853c-545679d761b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.280 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Paused (Lifecycle Event)#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.297 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.300 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG nova.compute.manager [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.307 226326 DEBUG oslo_concurrency.lockutils [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.308 226326 DEBUG nova.compute.manager [req-417921f6-960d-4084-8629-e02aa620a4c2 req-f39bde49-ef2c-4f96-b020-f4da5939b1fc b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Processing event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.308 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.312 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.314 226326 INFO nova.virt.libvirt.driver [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance spawned successfully.
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.315 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.327 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.328 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422453.310944, 51ec8779-f667-4f68-853c-545679d761b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.328 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Resumed (Lifecycle Event)
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.347 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.351 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.351 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.352 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.352 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.353 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.353 226326 DEBUG nova.virt.libvirt.driver [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.356 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.414 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 26 05:14:13 np0005595445 podman[236226]: 2026-01-26 10:14:13.428519349 +0000 UTC m=+0.110285320 container create 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.433 226326 INFO nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 8.02 seconds to spawn the instance on the hypervisor.
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.433 226326 DEBUG nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:14:13 np0005595445 podman[236226]: 2026-01-26 10:14:13.343381089 +0000 UTC m=+0.025147090 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 05:14:13 np0005595445 systemd[1]: Started libpod-conmon-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope.
Jan 26 05:14:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:13 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:14:13 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae0fb92a0feb3d3a75cace414af4b8a6d6b232b5f42090c39d7bdc2f5ca3bf09/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 05:14:13 np0005595445 podman[236226]: 2026-01-26 10:14:13.515442211 +0000 UTC m=+0.197208202 container init 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 26 05:14:13 np0005595445 podman[236226]: 2026-01-26 10:14:13.520944425 +0000 UTC m=+0.202710396 container start 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.529 226326 INFO nova.compute.manager [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 9.02 seconds to build instance.
Jan 26 05:14:13 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : New worker (236247) forked
Jan 26 05:14:13 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : Loading success.
Jan 26 05:14:13 np0005595445 nova_compute[226322]: 2026-01-26 10:14:13.551 226326 DEBUG oslo_concurrency.lockutils [None req-66f58136-e58c-4e2f-a79a-9097a9f32863 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 26 05:14:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:13.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.162445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454162469, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1162, "num_deletes": 501, "total_data_size": 1931285, "memory_usage": 1966512, "flush_reason": "Manual Compaction"}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454175496, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1267861, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30075, "largest_seqno": 31232, "table_properties": {"data_size": 1263060, "index_size": 1877, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14321, "raw_average_key_size": 19, "raw_value_size": 1251336, "raw_average_value_size": 1704, "num_data_blocks": 82, "num_entries": 734, "num_filter_entries": 734, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422388, "oldest_key_time": 1769422388, "file_creation_time": 1769422454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 13088 microseconds, and 4614 cpu microseconds.
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.175529) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1267861 bytes OK
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.175546) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.176993) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177005) EVENT_LOG_v1 {"time_micros": 1769422454177001, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177019) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1924654, prev total WAL file size 1924654, number of live WAL files 2.
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177772) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1238KB)], [57(16MB)]
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454177963, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18552709, "oldest_snapshot_seqno": -1}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5762 keys, 12337760 bytes, temperature: kUnknown
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454298168, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12337760, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12301098, "index_size": 21128, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 148993, "raw_average_key_size": 25, "raw_value_size": 12198773, "raw_average_value_size": 2117, "num_data_blocks": 846, "num_entries": 5762, "num_filter_entries": 5762, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422454, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.298389) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12337760 bytes
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.301948) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.3 rd, 102.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.5 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(24.4) write-amplify(9.7) OK, records in: 6779, records dropped: 1017 output_compression: NoCompression
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.301969) EVENT_LOG_v1 {"time_micros": 1769422454301959, "job": 34, "event": "compaction_finished", "compaction_time_micros": 120242, "compaction_time_cpu_micros": 26053, "output_level": 6, "num_output_files": 1, "total_output_size": 12337760, "num_input_records": 6779, "num_output_records": 5762, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454302306, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422454305297, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.177516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305351) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:14:14.305355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:14:14 np0005595445 nova_compute[226322]: 2026-01-26 10:14:14.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000059s ======
Jan 26 05:14:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000059s
Jan 26 05:14:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:15.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.603 226326 DEBUG nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.604 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.606 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.606 226326 DEBUG oslo_concurrency.lockutils [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.607 226326 DEBUG nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 05:14:15 np0005595445 nova_compute[226322]: 2026-01-26 10:14:15.607 226326 WARNING nova.compute.manager [req-1907636a-56a8-4089-ad10-4c11149d8eea req-c4d4f985-c43f-445a-bfb9-56cc5e228e5a b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received unexpected event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with vm_state active and task_state None.
Jan 26 05:14:16 np0005595445 NetworkManager[49073]: <info>  [1769422456.3461] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 05:14:16 np0005595445 NetworkManager[49073]: <info>  [1769422456.3470] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 05:14:16 np0005595445 nova_compute[226322]: 2026-01-26 10:14:16.346 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:16 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:16Z|00061|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 05:14:16 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:16Z|00062|binding|INFO|Releasing lport 242dde27-5aff-4cac-b664-221ab4bfb94f from this chassis (sb_readonly=0)
Jan 26 05:14:16 np0005595445 nova_compute[226322]: 2026-01-26 10:14:16.395 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:16 np0005595445 nova_compute[226322]: 2026-01-26 10:14:16.398 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:16 np0005595445 nova_compute[226322]: 2026-01-26 10:14:16.557 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.074 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.075 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.076 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.077 226326 INFO nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Terminating instance
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.078 226326 DEBUG nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 26 05:14:17 np0005595445 kernel: tap386a7730-6a (unregistering): left promiscuous mode
Jan 26 05:14:17 np0005595445 NetworkManager[49073]: <info>  [1769422457.1976] device (tap386a7730-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 05:14:17 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:17Z|00063|binding|INFO|Releasing lport 386a7730-6a16-4b18-b368-561762a8f7af from this chassis (sb_readonly=0)
Jan 26 05:14:17 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:17Z|00064|binding|INFO|Setting lport 386a7730-6a16-4b18-b368-561762a8f7af down in Southbound
Jan 26 05:14:17 np0005595445 ovn_controller[133670]: 2026-01-26T10:14:17Z|00065|binding|INFO|Removing iface tap386a7730-6a ovn-installed in OVS
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.229 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.233 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.238 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:e4:a1 10.100.0.10'], port_security=['fa:16:3e:d6:e4:a1 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '51ec8779-f667-4f68-853c-545679d761b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1712540863', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '75a6a4cb-bd58-457c-b449-9db5f70f3f78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=01b44e6c-3a91-48f0-92f1-3334bccbc3c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=386a7730-6a16-4b18-b368-561762a8f7af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.239 143326 INFO neutron.agent.ovn.metadata.agent [-] Port 386a7730-6a16-4b18-b368-561762a8f7af in datapath f91dcb4b-184c-45d6-a0e9-285bb6bc3464 unbound from our chassis
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.240 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.241 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[eba20e8b-44e9-46a1-8dcf-e5be974520a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.241 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 namespace which is not needed anymore
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.251 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:17 np0005595445 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 26 05:14:17 np0005595445 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 4.270s CPU time.
Jan 26 05:14:17 np0005595445 systemd-machined[194876]: Machine qemu-4-instance-00000008 terminated.
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.319 226326 INFO nova.virt.libvirt.driver [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Instance destroyed successfully.#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.320 226326 DEBUG nova.objects.instance [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid 51ec8779-f667-4f68-853c-545679d761b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.334 226326 DEBUG nova.virt.libvirt.vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744381875',display_name='tempest-TestNetworkBasicOps-server-1744381875',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744381875',id=8,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAvyk/Go9GRUgKM9ccY9S+v3kz7mMcjasfiu2L0DFwS5oDB3yiDxzsS07sSdloLffH02y1mQmQVvZg5ozr00/t6RFvm10CNHliA9YweQcnIUE4iIcxPZtBU7hWU+AN6Ubw==',key_name='tempest-TestNetworkBasicOps-212494757',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:14:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-u51vza1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:14:13Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=51ec8779-f667-4f68-853c-545679d761b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.334 226326 DEBUG nova.network.os_vif_util [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.335 226326 DEBUG nova.network.os_vif_util [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.335 226326 DEBUG os_vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.336 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.336 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap386a7730-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.338 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.344 226326 INFO os_vif [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:e4:a1,bridge_name='br-int',has_traffic_filtering=True,id=386a7730-6a16-4b18-b368-561762a8f7af,network=Network(f91dcb4b-184c-45d6-a0e9-285bb6bc3464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap386a7730-6a')#033[00m
Jan 26 05:14:17 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : haproxy version is 2.8.14-c23fe91
Jan 26 05:14:17 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [NOTICE]   (236245) : path to executable is /usr/sbin/haproxy
Jan 26 05:14:17 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [WARNING]  (236245) : Exiting Master process...
Jan 26 05:14:17 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [ALERT]    (236245) : Current worker (236247) exited with code 143 (Terminated)
Jan 26 05:14:17 np0005595445 neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464[236241]: [WARNING]  (236245) : All workers exited. Exiting... (0)
Jan 26 05:14:17 np0005595445 systemd[1]: libpod-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope: Deactivated successfully.
Jan 26 05:14:17 np0005595445 podman[236296]: 2026-01-26 10:14:17.392377362 +0000 UTC m=+0.050704693 container died 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 26 05:14:17 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020-userdata-shm.mount: Deactivated successfully.
Jan 26 05:14:17 np0005595445 systemd[1]: var-lib-containers-storage-overlay-ae0fb92a0feb3d3a75cace414af4b8a6d6b232b5f42090c39d7bdc2f5ca3bf09-merged.mount: Deactivated successfully.
Jan 26 05:14:17 np0005595445 podman[236296]: 2026-01-26 10:14:17.42919543 +0000 UTC m=+0.087522761 container cleanup 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 26 05:14:17 np0005595445 systemd[1]: libpod-conmon-2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020.scope: Deactivated successfully.
Jan 26 05:14:17 np0005595445 podman[236343]: 2026-01-26 10:14:17.492616981 +0000 UTC m=+0.045620132 container remove 2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 26 05:14:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:17.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.498 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[36214e9f-d97f-4b94-a661-e539f4bf6900]: (4, ('Mon Jan 26 10:14:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 (2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020)\n2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020\nMon Jan 26 10:14:17 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 (2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020)\n2fcaef581211e67fb1d438562bd728c9788448dd06e8bdc3ee14c8f618298020\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.500 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[de535640-afd2-4dbd-8092-86da3e2e813f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.501 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf91dcb4b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.502 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:17 np0005595445 kernel: tapf91dcb4b-10: left promiscuous mode
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.521 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[4125d8f9-fa97-445f-b0ea-4022e6b6f3d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.541 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[b39eab59-1a73-4e57-8a10-975f5cbed1f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.542 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[81046efe-1171-40de-8342-ae21e9cf01c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.562 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf46682-64b8-4fdd-9955-029e01e3e7e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441336, 'reachable_time': 43919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236359, 'error': None, 'target': 'ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 systemd[1]: run-netns-ovnmeta\x2df91dcb4b\x2d184c\x2d45d6\x2da0e9\x2d285bb6bc3464.mount: Deactivated successfully.
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.565 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f91dcb4b-184c-45d6-a0e9-285bb6bc3464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 05:14:17 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:17.565 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[52bc1476-2007-4c26-bb54-439faa430d50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:14:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:17.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.665 226326 DEBUG nova.compute.manager [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-changed-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.665 226326 DEBUG nova.compute.manager [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing instance network info cache due to event network-changed-386a7730-6a16-4b18-b368-561762a8f7af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.666 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Refreshing network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.741 226326 INFO nova.virt.libvirt.driver [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deleting instance files /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9_del#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.742 226326 INFO nova.virt.libvirt.driver [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deletion of /var/lib/nova/instances/51ec8779-f667-4f68-853c-545679d761b9_del complete#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.813 226326 INFO nova.compute.manager [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG oslo.service.loopingcall [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 05:14:17 np0005595445 nova_compute[226322]: 2026-01-26 10:14:17.814 226326 DEBUG nova.network.neutron [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 05:14:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:14:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:19.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:14:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:14:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:19.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.902 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.903 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.903 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG oslo_concurrency.lockutils [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 05:14:19 np0005595445 nova_compute[226322]: 2026-01-26 10:14:19.904 226326 DEBUG nova.compute.manager [req-04875165-a0ef-4565-a618-107e586658b7 req-8bda776c-de0b-4179-9c23-dda2d541e618 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-unplugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 26 05:14:21 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:14:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:21.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:21.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.758 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updated VIF entry in instance network info cache for port 386a7730-6a16-4b18-b368-561762a8f7af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.759 226326 DEBUG nova.network.neutron [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [{"id": "386a7730-6a16-4b18-b368-561762a8f7af", "address": "fa:16:3e:d6:e4:a1", "network": {"id": "f91dcb4b-184c-45d6-a0e9-285bb6bc3464", "bridge": "br-int", "label": "tempest-network-smoke--753987758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap386a7730-6a", "ovs_interfaceid": "386a7730-6a16-4b18-b368-561762a8f7af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.777 226326 DEBUG oslo_concurrency.lockutils [req-ae1e18f9-a5ef-457c-89f6-ca94869baf3a req-89372e82-9d81-444f-a8cc-a59a98645539 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-51ec8779-f667-4f68-853c-545679d761b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.921 226326 DEBUG nova.network.neutron [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.939 226326 INFO nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Took 4.12 seconds to deallocate network for instance.
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.994 226326 DEBUG nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.994 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "51ec8779-f667-4f68-853c-545679d761b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.995 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.995 226326 DEBUG oslo_concurrency.lockutils [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.996 226326 DEBUG nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] No waiting events found dispatching network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.996 226326 WARNING nova.compute.manager [req-e46b7dd0-ac26-4a95-b391-a57697225b73 req-14264520-c892-48b6-b7e9-8a886cf29dd8 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Received unexpected event network-vif-plugged-386a7730-6a16-4b18-b368-561762a8f7af for instance with vm_state active and task_state deleting.
Jan 26 05:14:21 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.999 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:21.999 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.053 226326 DEBUG oslo_concurrency.processutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.338 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:14:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:14:22 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:14:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:14:22 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3329353095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.521 226326 DEBUG oslo_concurrency.processutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.528 226326 DEBUG nova.compute.provider_tree [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.624 226326 DEBUG nova.scheduler.client.report [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.645 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.679 226326 INFO nova.scheduler.client.report [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance 51ec8779-f667-4f68-853c-545679d761b9
Jan 26 05:14:22 np0005595445 nova_compute[226322]: 2026-01-26 10:14:22.747 226326 DEBUG oslo_concurrency.lockutils [None req-722a5c73-29ea-4ce0-8d70-f01a6ffd0281 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "51ec8779-f667-4f68-853c-545679d761b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:14:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 26 05:14:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:23.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 26 05:14:24 np0005595445 podman[236468]: 2026-01-26 10:14:24.281927039 +0000 UTC m=+0.055282600 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 26 05:14:24 np0005595445 nova_compute[226322]: 2026-01-26 10:14:24.783 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:24 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:24.783 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 26 05:14:24 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:24.785 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 26 05:14:24 np0005595445 nova_compute[226322]: 2026-01-26 10:14:24.821 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:25.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:25.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:27 np0005595445 nova_compute[226322]: 2026-01-26 10:14:27.340 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:27.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:27.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:14:28 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:14:28 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:28.789 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 05:14:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:29.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:29.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:29 np0005595445 nova_compute[226322]: 2026-01-26 10:14:29.824 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:31.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:32 np0005595445 nova_compute[226322]: 2026-01-26 10:14:32.318 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422457.316042, 51ec8779-f667-4f68-853c-545679d761b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 26 05:14:32 np0005595445 nova_compute[226322]: 2026-01-26 10:14:32.318 226326 INFO nova.compute.manager [-] [instance: 51ec8779-f667-4f68-853c-545679d761b9] VM Stopped (Lifecycle Event)
Jan 26 05:14:32 np0005595445 nova_compute[226322]: 2026-01-26 10:14:32.341 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:32 np0005595445 nova_compute[226322]: 2026-01-26 10:14:32.349 226326 DEBUG nova.compute.manager [None req-daa35117-bc0e-4564-889e-3cdff8461da1 - - - - - -] [instance: 51ec8779-f667-4f68-853c-545679d761b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 26 05:14:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:33 np0005595445 nova_compute[226322]: 2026-01-26 10:14:33.378 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:33 np0005595445 nova_compute[226322]: 2026-01-26 10:14:33.379 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:33.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:33.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:33 np0005595445 nova_compute[226322]: 2026-01-26 10:14:33.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.715 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 26 05:14:34 np0005595445 nova_compute[226322]: 2026-01-26 10:14:34.826 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:14:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:35.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:35.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:35 np0005595445 nova_compute[226322]: 2026-01-26 10:14:35.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:35 np0005595445 nova_compute[226322]: 2026-01-26 10:14:35.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 26 05:14:36 np0005595445 nova_compute[226322]: 2026-01-26 10:14:36.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 26 05:14:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.342 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:37.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:37.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:14:37 np0005595445 nova_compute[226322]: 2026-01-26 10:14:37.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:14:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3798574157' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.177 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.340 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.342 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4864MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.342 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.343 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.401 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.402 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.414 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:14:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:14:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/767898149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.859 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.869 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.889 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.916 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:14:38 np0005595445 nova_compute[226322]: 2026-01-26 10:14:38.917 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:39.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:14:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:39.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:14:39 np0005595445 nova_compute[226322]: 2026-01-26 10:14:39.827 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:40 np0005595445 podman[236593]: 2026-01-26 10:14:40.340450021 +0000 UTC m=+0.120972767 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:14:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:41.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:42 np0005595445 nova_compute[226322]: 2026-01-26 10:14:42.343 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:43.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:43.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:44 np0005595445 nova_compute[226322]: 2026-01-26 10:14:44.874 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:45.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:45.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:47 np0005595445 nova_compute[226322]: 2026-01-26 10:14:47.347 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:47.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:47.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:49.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:49.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:49 np0005595445 nova_compute[226322]: 2026-01-26 10:14:49.876 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:51.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:51.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:52 np0005595445 nova_compute[226322]: 2026-01-26 10:14:52.373 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:53.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:53 np0005595445 nova_compute[226322]: 2026-01-26 10:14:53.647 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:53.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:53 np0005595445 nova_compute[226322]: 2026-01-26 10:14:53.718 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.937 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:14:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:14:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:14:53.938 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:14:54 np0005595445 nova_compute[226322]: 2026-01-26 10:14:54.878 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:55 np0005595445 podman[236654]: 2026-01-26 10:14:55.269537715 +0000 UTC m=+0.046461203 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 05:14:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:14:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:55.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:14:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:55.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:14:57 np0005595445 nova_compute[226322]: 2026-01-26 10:14:57.377 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:14:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:57.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:14:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:57.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:14:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:14:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:14:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:14:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:14:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:14:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:14:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:14:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:14:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/96791862' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:14:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:14:59.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:14:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:14:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:14:59.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:14:59 np0005595445 nova_compute[226322]: 2026-01-26 10:14:59.880 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:15:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:01.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:15:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:15:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:01.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:15:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:02 np0005595445 nova_compute[226322]: 2026-01-26 10:15:02.381 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:03.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:15:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:03.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:15:04 np0005595445 nova_compute[226322]: 2026-01-26 10:15:04.882 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:15:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:05.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:15:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:15:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:05.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:15:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:07 np0005595445 nova_compute[226322]: 2026-01-26 10:15:07.385 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:07.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:15:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:07.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:15:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:09.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:09.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:09 np0005595445 nova_compute[226322]: 2026-01-26 10:15:09.884 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:11 np0005595445 podman[236713]: 2026-01-26 10:15:11.305334104 +0000 UTC m=+0.086686652 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 05:15:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:11.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:15:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:11.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:15:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:12 np0005595445 nova_compute[226322]: 2026-01-26 10:15:12.387 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 26 05:15:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:13.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 26 05:15:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:13.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:14 np0005595445 nova_compute[226322]: 2026-01-26 10:15:14.886 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:15.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:15:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:15.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:15:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:17 np0005595445 nova_compute[226322]: 2026-01-26 10:15:17.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:17.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:17.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:19.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:19 np0005595445 nova_compute[226322]: 2026-01-26 10:15:19.887 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:21.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:21.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:22 np0005595445 nova_compute[226322]: 2026-01-26 10:15:22.393 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:23.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:23.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:24 np0005595445 nova_compute[226322]: 2026-01-26 10:15:24.889 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:25.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:25.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:25 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.968 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:15:25 np0005595445 nova_compute[226322]: 2026-01-26 10:15:25.968 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:25 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.969 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:15:25 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:25.971 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:15:26 np0005595445 podman[236774]: 2026-01-26 10:15:26.273397376 +0000 UTC m=+0.051422957 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:15:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:27 np0005595445 nova_compute[226322]: 2026-01-26 10:15:27.396 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:27.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:27.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:28 np0005595445 ovn_controller[133670]: 2026-01-26T10:15:28Z|00066|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 05:15:29 np0005595445 podman[236917]: 2026-01-26 10:15:29.55440675 +0000 UTC m=+1.163738272 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 26 05:15:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:29.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:29.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:29 np0005595445 podman[236917]: 2026-01-26 10:15:29.816085996 +0000 UTC m=+1.425417508 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 26 05:15:29 np0005595445 nova_compute[226322]: 2026-01-26 10:15:29.891 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:31 np0005595445 podman[237059]: 2026-01-26 10:15:31.00409731 +0000 UTC m=+0.176667892 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:15:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:31 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 05:15:31 np0005595445 podman[237084]: 2026-01-26 10:15:31.087909072 +0000 UTC m=+0.062935853 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:15:31 np0005595445 podman[237059]: 2026-01-26 10:15:31.299179079 +0000 UTC m=+0.471749681 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:15:31 np0005595445 podman[237133]: 2026-01-26 10:15:31.602691137 +0000 UTC m=+0.105487354 container exec 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 26 05:15:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:31.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:31 np0005595445 podman[237153]: 2026-01-26 10:15:31.674892552 +0000 UTC m=+0.052953799 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:15:31 np0005595445 podman[237133]: 2026-01-26 10:15:31.713661922 +0000 UTC m=+0.216458099 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 05:15:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:31.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:32 np0005595445 podman[237197]: 2026-01-26 10:15:32.083163695 +0000 UTC m=+0.232575370 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:15:32 np0005595445 podman[237197]: 2026-01-26 10:15:32.331073853 +0000 UTC m=+0.480485418 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:15:32 np0005595445 nova_compute[226322]: 2026-01-26 10:15:32.398 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:32 np0005595445 podman[237263]: 2026-01-26 10:15:32.749825824 +0000 UTC m=+0.229444705 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, release=1793, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 05:15:32 np0005595445 podman[237284]: 2026-01-26 10:15:32.821850773 +0000 UTC m=+0.052793804 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=keepalived for Ceph, release=1793, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 05:15:32 np0005595445 podman[237263]: 2026-01-26 10:15:32.915386 +0000 UTC m=+0.395004861 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, description=keepalived for Ceph)
Jan 26 05:15:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:33.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:33.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:33 np0005595445 nova_compute[226322]: 2026-01-26 10:15:33.919 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:34 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:34 np0005595445 nova_compute[226322]: 2026-01-26 10:15:34.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:34 np0005595445 nova_compute[226322]: 2026-01-26 10:15:34.893 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:35.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:35 np0005595445 nova_compute[226322]: 2026-01-26 10:15:35.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:35 np0005595445 nova_compute[226322]: 2026-01-26 10:15:35.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:35 np0005595445 nova_compute[226322]: 2026-01-26 10:15:35.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:35.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:35 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:36 np0005595445 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:36 np0005595445 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:15:36 np0005595445 nova_compute[226322]: 2026-01-26 10:15:36.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:15:36 np0005595445 nova_compute[226322]: 2026-01-26 10:15:36.705 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:15:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:37 np0005595445 nova_compute[226322]: 2026-01-26 10:15:37.399 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:37.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:37 np0005595445 nova_compute[226322]: 2026-01-26 10:15:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:37 np0005595445 nova_compute[226322]: 2026-01-26 10:15:37.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:15:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:37.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:15:37 np0005595445 ceph-mon[80107]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 26 05:15:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:38 np0005595445 nova_compute[226322]: 2026-01-26 10:15:38.688 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:38 np0005595445 nova_compute[226322]: 2026-01-26 10:15:38.689 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:15:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:39.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:15:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:39.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.855 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.856 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:15:39 np0005595445 nova_compute[226322]: 2026-01-26 10:15:39.895 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:15:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/552703975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.675s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.751 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.753 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.94289016723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.753 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.754 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.921 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.922 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:15:40 np0005595445 nova_compute[226322]: 2026-01-26 10:15:40.958 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:15:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:15:41 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3497306287' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:15:41 np0005595445 nova_compute[226322]: 2026-01-26 10:15:41.427 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:15:41 np0005595445 nova_compute[226322]: 2026-01-26 10:15:41.436 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:15:41 np0005595445 nova_compute[226322]: 2026-01-26 10:15:41.467 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:15:41 np0005595445 nova_compute[226322]: 2026-01-26 10:15:41.470 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:15:41 np0005595445 nova_compute[226322]: 2026-01-26 10:15:41.470 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:15:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:41.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:41.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:42 np0005595445 podman[237496]: 2026-01-26 10:15:42.351636897 +0000 UTC m=+0.118254794 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 26 05:15:42 np0005595445 nova_compute[226322]: 2026-01-26 10:15:42.401 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:43.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:15:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:43.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:15:44 np0005595445 nova_compute[226322]: 2026-01-26 10:15:44.896 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:45.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:45.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:47 np0005595445 nova_compute[226322]: 2026-01-26 10:15:47.449 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:47.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:47.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:49.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:49 np0005595445 nova_compute[226322]: 2026-01-26 10:15:49.899 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:50 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:15:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:51.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:51.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:52 np0005595445 nova_compute[226322]: 2026-01-26 10:15:52.455 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:53.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:53.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.939 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:15:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:15:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:15:53.945 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:15:54 np0005595445 nova_compute[226322]: 2026-01-26 10:15:54.949 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:55.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:55.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:15:57 np0005595445 podman[237584]: 2026-01-26 10:15:57.302194206 +0000 UTC m=+0.074428295 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:15:57 np0005595445 nova_compute[226322]: 2026-01-26 10:15:57.492 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:15:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:57.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:15:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:15:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:15:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:15:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:15:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:15:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:15:59.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:15:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:15:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:15:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:15:59.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:15:59 np0005595445 nova_compute[226322]: 2026-01-26 10:15:59.988 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:01.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:01.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:02 np0005595445 nova_compute[226322]: 2026-01-26 10:16:02.493 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:03.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:03.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:04 np0005595445 nova_compute[226322]: 2026-01-26 10:16:04.990 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:05.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:05.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:07 np0005595445 nova_compute[226322]: 2026-01-26 10:16:07.545 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 05:16:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:07.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 05:16:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:07.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:09.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:09.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:09 np0005595445 nova_compute[226322]: 2026-01-26 10:16:09.991 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:11 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:12 np0005595445 nova_compute[226322]: 2026-01-26 10:16:12.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:12 np0005595445 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 05:16:12 np0005595445 podman[237637]: 2026-01-26 10:16:12.914582369 +0000 UTC m=+0.110295707 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 05:16:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:13.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:14 np0005595445 nova_compute[226322]: 2026-01-26 10:16:14.994 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:15.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:16 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:17 np0005595445 nova_compute[226322]: 2026-01-26 10:16:17.552 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:17.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:17.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:19.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:19 np0005595445 nova_compute[226322]: 2026-01-26 10:16:19.996 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:21.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:21 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:21.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:22 np0005595445 nova_compute[226322]: 2026-01-26 10:16:22.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:23.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:23.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:24 np0005595445 nova_compute[226322]: 2026-01-26 10:16:24.997 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:25.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:25.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.556 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:27.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.866 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.866 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.881 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.948 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.949 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.957 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 26 05:16:27 np0005595445 nova_compute[226322]: 2026-01-26 10:16:27.958 226326 INFO nova.compute.claims [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Jan 26 05:16:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.065 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:28 np0005595445 podman[237717]: 2026-01-26 10:16:28.275367724 +0000 UTC m=+0.054182343 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 26 05:16:28 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:16:28 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2689113532' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.502 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.509 226326 DEBUG nova.compute.provider_tree [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.526 226326 DEBUG nova.scheduler.client.report [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.546 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.547 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.603 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.603 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.638 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.663 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.771 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.772 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.772 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating image(s)#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.796 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.822 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.845 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.849 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.905 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.906 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "d81880e926e175d0cc7241caa7cc18231a8a289c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.906 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.907 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "d81880e926e175d0cc7241caa7cc18231a8a289c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.933 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:28 np0005595445 nova_compute[226322]: 2026-01-26 10:16:28.936 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c b040151a-46d9-4685-84c4-316c2d7feedb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.177 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/d81880e926e175d0cc7241caa7cc18231a8a289c b040151a-46d9-4685-84c4-316c2d7feedb_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.241s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.234 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] resizing rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.264 226326 DEBUG nova.policy [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1208d3e25b940ea93fe76884c7a53db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.313 226326 DEBUG nova.objects.instance [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'migration_context' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.329 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.330 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Ensure instance console log exists: /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.331 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.331 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.332 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 05:16:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:29.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:29 np0005595445 nova_compute[226322]: 2026-01-26 10:16:29.999 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:30 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:30.481 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:16:30 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:30.482 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:16:30 np0005595445 nova_compute[226322]: 2026-01-26 10:16:30.482 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:30 np0005595445 nova_compute[226322]: 2026-01-26 10:16:30.589 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Successfully created port: e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.521 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Successfully updated port: e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.539 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.654 226326 DEBUG nova.compute.manager [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.655 226326 DEBUG nova.compute.manager [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.655 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:16:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:31.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:31 np0005595445 nova_compute[226322]: 2026-01-26 10:16:31.746 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 26 05:16:31 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:31.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.558 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.756 226326 DEBUG nova.network.neutron [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.845 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance network_info: |[{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.846 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.849 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start _get_guest_xml network_info=[{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'device_name': '/dev/vda', 'guest_format': None, 'size': 0, 'image_id': '6789692f-fc1f-4efa-ae75-dcc13be695ef'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.854 226326 WARNING nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.859 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.860 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.863 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.864 226326 DEBUG nova.virt.libvirt.host [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.864 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.865 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T10:05:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='57e1601b-dbfa-4d3b-8b96-27302e4a7a06',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T10:05:39Z,direct_url=<?>,disk_format='qcow2',id=6789692f-fc1f-4efa-ae75-dcc13be695ef,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3ff3fa2a5531460b993c609589aa545d',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T10:05:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.865 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.866 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.867 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.868 226326 DEBUG nova.virt.hardware [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.871 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:32 np0005595445 nova_compute[226322]: 2026-01-26 10:16:32.893 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:16:33 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1830823020' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.370 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.397 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.401 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:33.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:33 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 26 05:16:33 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3710592775' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 26 05:16:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:33.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.836 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.837 226326 DEBUG nova.virt.libvirt.vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:16:28Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.838 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.839 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.840 226326 DEBUG nova.objects.instance [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.859 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] End _get_guest_xml xml=<domain type="kvm">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <uuid>b040151a-46d9-4685-84c4-316c2d7feedb</uuid>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <name>instance-0000000c</name>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <memory>131072</memory>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <vcpu>1</vcpu>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <metadata>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:name>tempest-TestNetworkBasicOps-server-1174728540</nova:name>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:creationTime>2026-01-26 10:16:32</nova:creationTime>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:flavor name="m1.nano">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:memory>128</nova:memory>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:disk>1</nova:disk>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:swap>0</nova:swap>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:ephemeral>0</nova:ephemeral>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:vcpus>1</nova:vcpus>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </nova:flavor>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:owner>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:user uuid="c1208d3e25b940ea93fe76884c7a53db">tempest-TestNetworkBasicOps-966559857-project-member</nova:user>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:project uuid="6ed221b375a44fc2bb2a8f232c5446e7">tempest-TestNetworkBasicOps-966559857</nova:project>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </nova:owner>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:root type="image" uuid="6789692f-fc1f-4efa-ae75-dcc13be695ef"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <nova:ports>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <nova:port uuid="e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        </nova:port>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </nova:ports>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </nova:instance>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </metadata>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <sysinfo type="smbios">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <system>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="manufacturer">RDO</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="product">OpenStack Compute</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="serial">b040151a-46d9-4685-84c4-316c2d7feedb</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="uuid">b040151a-46d9-4685-84c4-316c2d7feedb</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <entry name="family">Virtual Machine</entry>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </system>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </sysinfo>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <os>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <boot dev="hd"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <smbios mode="sysinfo"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </os>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <features>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <acpi/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <apic/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <vmcoreinfo/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </features>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <clock offset="utc">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <timer name="pit" tickpolicy="delay"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <timer name="hpet" present="no"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </clock>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <cpu mode="host-model" match="exact">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <topology sockets="1" cores="1" threads="1"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </cpu>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  <devices>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <disk type="network" device="disk">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/b040151a-46d9-4685-84c4-316c2d7feedb_disk">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <target dev="vda" bus="virtio"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <disk type="network" device="cdrom">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <driver type="raw" cache="none"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <source protocol="rbd" name="vms/b040151a-46d9-4685-84c4-316c2d7feedb_disk.config">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.100" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.102" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <host name="192.168.122.101" port="6789"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </source>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <auth username="openstack">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:        <secret type="ceph" uuid="1a70b85d-e3fd-5814-8a6a-37ea00fcae30"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      </auth>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <target dev="sda" bus="sata"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </disk>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <interface type="ethernet">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <mac address="fa:16:3e:97:1d:9d"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <driver name="vhost" rx_queue_size="512"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <mtu size="1442"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <target dev="tape8f0e0cf-36"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </interface>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <serial type="pty">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <log file="/var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/console.log" append="off"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </serial>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <video>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <model type="virtio"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </video>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <input type="tablet" bus="usb"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <rng model="virtio">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <backend model="random">/dev/urandom</backend>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </rng>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="pci" model="pcie-root-port"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <controller type="usb" index="0"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    <memballoon model="virtio">
Jan 26 05:16:33 np0005595445 nova_compute[226322]:      <stats period="10"/>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:    </memballoon>
Jan 26 05:16:33 np0005595445 nova_compute[226322]:  </devices>
Jan 26 05:16:33 np0005595445 nova_compute[226322]: </domain>
Jan 26 05:16:33 np0005595445 nova_compute[226322]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Preparing to wait for external event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.861 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.862 226326 DEBUG nova.virt.libvirt.vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T10:16:28Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.862 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.863 226326 DEBUG nova.network.os_vif_util [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.863 226326 DEBUG os_vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.864 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.864 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.865 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.867 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.868 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8f0e0cf-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.869 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape8f0e0cf-36, col_values=(('external_ids', {'iface-id': 'e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:1d:9d', 'vm-uuid': 'b040151a-46d9-4685-84c4-316c2d7feedb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:33 np0005595445 NetworkManager[49073]: <info>  [1769422593.8714] manager: (tape8f0e0cf-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.873 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.878 226326 INFO os_vif [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36')#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.920 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] No VIF found with MAC fa:16:3e:97:1d:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.921 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Using config drive#033[00m
Jan 26 05:16:33 np0005595445 nova_compute[226322]: 2026-01-26 10:16:33.944 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.229 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.230 226326 DEBUG nova.network.neutron [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.252 226326 DEBUG oslo_concurrency.lockutils [req-21652baa-e1ad-4586-8489-7f8a7af32e45 req-439277ee-8eb6-4654-8f70-9ea8cf0a7a6c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.337 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Creating config drive at /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.342 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mg2qu9r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.467 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_mg2qu9r" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.497 226326 DEBUG nova.storage.rbd_utils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] rbd image b040151a-46d9-4685-84c4-316c2d7feedb_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.500 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config b040151a-46d9-4685-84c4-316c2d7feedb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.725 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.777 226326 DEBUG oslo_concurrency.processutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config b040151a-46d9-4685-84c4-316c2d7feedb_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.778 226326 INFO nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deleting local config drive /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb/disk.config because it was imported into RBD.#033[00m
Jan 26 05:16:34 np0005595445 systemd[1]: Starting libvirt secret daemon...
Jan 26 05:16:34 np0005595445 systemd[1]: Started libvirt secret daemon.
Jan 26 05:16:34 np0005595445 kernel: tape8f0e0cf-36: entered promiscuous mode
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.8638] manager: (tape8f0e0cf-36): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 05:16:34 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:34Z|00067|binding|INFO|Claiming lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for this chassis.
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.863 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:34Z|00068|binding|INFO|e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae: Claiming fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.868 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.870 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.8740] manager: (patch-br-int-to-provnet-94d9950f-5cf2-4813-9455-dd14377245f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.872 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.8771] manager: (patch-provnet-94d9950f-5cf2-4813-9455-dd14377245f4-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.888 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:1d:9d 10.100.0.12'], port_security=['fa:16:3e:97:1d:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b040151a-46d9-4685-84c4-316c2d7feedb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5cce9ca3-082e-4a27-8023-5db6a50012d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db1eda71-392f-4d4b-8724-78530674037e, chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.889 143326 INFO neutron.agent.ovn.metadata.agent [-] Port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae in datapath 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e bound to our chassis#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.891 143326 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e#033[00m
Jan 26 05:16:34 np0005595445 systemd-machined[194876]: New machine qemu-5-instance-0000000c.
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.901 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[978aa8cd-3d1d-410a-898f-1e45e9f8448e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.902 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bff64e0-61 in ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bff64e0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[af69d122-a6fa-457f-b88c-b7798049706e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.904 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a5580ffd-bcee-433e-9fad-011901dddcc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.915 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[baad70b1-26f6-415b-8aad-184e96b07707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.939 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e821238e-bcff-4d7a-af47-f75acf9d48a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 systemd-udevd[238068]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.947 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.9531] device (tape8f0e0cf-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.9546] device (tape8f0e0cf-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.956 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.965 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9e68db-2735-4ba9-8b3b-2f88f20fdad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:34Z|00069|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae ovn-installed in OVS
Jan 26 05:16:34 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:34Z|00070|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae up in Southbound
Jan 26 05:16:34 np0005595445 nova_compute[226322]: 2026-01-26 10:16:34.968 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:34 np0005595445 systemd-udevd[238073]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 05:16:34 np0005595445 NetworkManager[49073]: <info>  [1769422594.9717] manager: (tap9bff64e0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.970 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a78a08-b02f-4c9e-8333-2ab2e889f0ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.994 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[501988c9-7219-404b-9b44-ee73879df3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:34 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:34.996 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0da81f-271c-4b1b-990d-044237bdcd51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:35 np0005595445 NetworkManager[49073]: <info>  [1769422595.0153] device (tap9bff64e0-60): carrier: link connected
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.019 229936 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7ea17c-f705-49a0-99d7-19fb7e193135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.033 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7103e3-c982-4a55-a830-1875a6bf0706]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bff64e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455558, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238098, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.047 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1288446b-80eb-4344-a9e7-e759dc3cc7d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:7f67'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455558, 'tstamp': 455558}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238099, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.063 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[749b439e-1b97-4e2b-b96a-4e5657f7d643]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bff64e0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:7f:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455558, 'reachable_time': 18566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238100, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.087 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1a46c1-2828-4c76-9d39-4647c47b244c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.137 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1f8a40-e61c-422e-a3eb-97bc251f1344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.138 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bff64e0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.138 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.139 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bff64e0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.175 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:35 np0005595445 NetworkManager[49073]: <info>  [1769422595.1760] manager: (tap9bff64e0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 05:16:35 np0005595445 kernel: tap9bff64e0-60: entered promiscuous mode
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.178 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bff64e0-60, col_values=(('external_ids', {'iface-id': '58fa2dc8-9a67-4ebd-8c74-a3dee5be3d64'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.179 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:35 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:35Z|00071|binding|INFO|Releasing lport 58fa2dc8-9a67-4ebd-8c74-a3dee5be3d64 from this chassis (sb_readonly=0)
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.193 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.194 143326 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.194 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[e1896926-c363-455d-8e96-e3b281e2aeba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.195 143326 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: global
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    log         /dev/log local0 debug
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    log-tag     haproxy-metadata-proxy-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    user        root
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    group       root
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    maxconn     1024
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    pidfile     /var/lib/neutron/external/pids/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.pid.haproxy
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    daemon
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: defaults
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    log global
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    mode http
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    option httplog
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    option dontlognull
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    option http-server-close
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    option forwardfor
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    retries                 3
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    timeout http-request    30s
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    timeout connect         30s
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    timeout client          32s
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    timeout server          32s
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    timeout http-keep-alive 30s
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: listen listener
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    bind 169.254.169.254:80
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    server metadata /var/lib/neutron/metadata_proxy
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]:    http-request add-header X-OVN-Network-ID 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.196 143326 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'env', 'PROCESS_TAG=haproxy-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bff64e0-694f-4b2d-b4b5-5e3b1d94460e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.437 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.4367375, b040151a-46d9-4685-84c4-316c2d7feedb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.438 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Started (Lifecycle Event)#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.465 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.471 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.4368613, b040151a-46d9-4685-84c4-316c2d7feedb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.471 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Paused (Lifecycle Event)#033[00m
Jan 26 05:16:35 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:35.485 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.537 226326 DEBUG nova.compute.manager [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.538 226326 DEBUG oslo_concurrency.lockutils [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.539 226326 DEBUG nova.compute.manager [req-7a5e3ded-2640-460d-9dff-843b6d3062c9 req-fbd04b0f-84ca-4f86-b88a-fa49da3fb062 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Processing event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.539 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.543 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.545 226326 INFO nova.virt.libvirt.driver [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance spawned successfully.#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.545 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 26 05:16:35 np0005595445 podman[238174]: 2026-01-26 10:16:35.551525709 +0000 UTC m=+0.049210247 container create ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.557 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.564 226326 DEBUG nova.virt.driver [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] Emitting event <LifecycleEvent: 1769422595.5424285, b040151a-46d9-4685-84c4-316c2d7feedb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.564 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Resumed (Lifecycle Event)#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.587 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.588 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.589 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.589 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.590 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.591 226326 DEBUG nova.virt.libvirt.driver [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.595 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.598 226326 DEBUG nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 26 05:16:35 np0005595445 systemd[1]: Started libpod-conmon-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope.
Jan 26 05:16:35 np0005595445 podman[238174]: 2026-01-26 10:16:35.525733864 +0000 UTC m=+0.023418442 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 26 05:16:35 np0005595445 systemd[1]: Started libcrun container.
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.626 226326 INFO nova.compute.manager [None req-139b4a87-d4e8-490b-b83c-512225a40eab - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 26 05:16:35 np0005595445 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78225d3fda0ba19bc05bebf3fc0755524da6232cfdd0544b56f1d421e656d4c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 05:16:35 np0005595445 podman[238174]: 2026-01-26 10:16:35.703884524 +0000 UTC m=+0.201569142 container init ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 05:16:35 np0005595445 podman[238174]: 2026-01-26 10:16:35.710749203 +0000 UTC m=+0.208433751 container start ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 26 05:16:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:35.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.724 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:35 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : New worker (238196) forked
Jan 26 05:16:35 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : Loading success.
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.732 226326 INFO nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 6.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.733 226326 DEBUG nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:16:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.842 226326 INFO nova.compute.manager [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 7.92 seconds to build instance.#033[00m
Jan 26 05:16:35 np0005595445 nova_compute[226322]: 2026-01-26 10:16:35.873 226326 DEBUG oslo_concurrency.lockutils [None req-fc64d6a8-9499-4da9-b256-a314157613f5 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:36 np0005595445 nova_compute[226322]: 2026-01-26 10:16:36.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:36 np0005595445 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:36 np0005595445 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:16:36 np0005595445 nova_compute[226322]: 2026-01-26 10:16:36.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:16:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.212 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.213 226326 DEBUG nova.objects.instance [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.665 226326 DEBUG nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.666 226326 DEBUG oslo_concurrency.lockutils [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.667 226326 DEBUG nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:16:37 np0005595445 nova_compute[226322]: 2026-01-26 10:16:37.667 226326 WARNING nova.compute.manager [req-2661df50-3103-4f8a-a995-04e222e11c3a req-565d1cb5-ba26-4ad4-9ead-33933eb521a0 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received unexpected event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with vm_state active and task_state None.#033[00m
Jan 26 05:16:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:37.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:37.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:38 np0005595445 nova_compute[226322]: 2026-01-26 10:16:38.872 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.544 226326 DEBUG nova.network.neutron [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.565 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.566 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:16:39 np0005595445 nova_compute[226322]: 2026-01-26 10:16:39.567 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:39.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:39.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:40 np0005595445 nova_compute[226322]: 2026-01-26 10:16:40.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:41 np0005595445 nova_compute[226322]: 2026-01-26 10:16:41.695 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:16:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:41.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.006 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.007 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.008 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.354 226326 DEBUG nova.compute.manager [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG nova.compute.manager [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.355 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.356 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:16:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:16:42 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3699931200' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:16:42 np0005595445 nova_compute[226322]: 2026-01-26 10:16:42.480 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.032 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.033 226326 DEBUG nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] skipping disk for instance-0000000c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.236 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.237 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4670MB free_disk=59.92185592651367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.237 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.238 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:43 np0005595445 podman[238231]: 2026-01-26 10:16:43.406163011 +0000 UTC m=+0.161907938 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.440 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Instance b040151a-46d9-4685-84c4-316c2d7feedb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.442 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.442 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.493 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.494 226326 DEBUG nova.network.neutron [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.570 226326 DEBUG oslo_concurrency.lockutils [req-119c5248-4dfb-4f64-b664-5fca0fb5b06c req-1d2e32c8-3983-49a0-bb03-00e7b3cbd844 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.610 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:16:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:43.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:43.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:43 np0005595445 nova_compute[226322]: 2026-01-26 10:16:43.874 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:16:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3339811741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:16:44 np0005595445 nova_compute[226322]: 2026-01-26 10:16:44.116 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:16:44 np0005595445 nova_compute[226322]: 2026-01-26 10:16:44.120 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:16:44 np0005595445 nova_compute[226322]: 2026-01-26 10:16:44.221 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:16:44 np0005595445 nova_compute[226322]: 2026-01-26 10:16:44.264 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:16:44 np0005595445 nova_compute[226322]: 2026-01-26 10:16:44.265 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:45 np0005595445 nova_compute[226322]: 2026-01-26 10:16:45.017 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 05:16:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:45.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 05:16:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:45.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:16:46 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:16:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:16:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:16:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:47.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:47.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:48 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:48Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 05:16:48 np0005595445 ovn_controller[133670]: 2026-01-26T10:16:48Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:1d:9d 10.100.0.12
Jan 26 05:16:48 np0005595445 nova_compute[226322]: 2026-01-26 10:16:48.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:49.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:49.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:50 np0005595445 nova_compute[226322]: 2026-01-26 10:16:50.018 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:51 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:16:51 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:16:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:51.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:51.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:53.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:53.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:53 np0005595445 nova_compute[226322]: 2026-01-26 10:16:53.888 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.940 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:16:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:16:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:16:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:16:55 np0005595445 nova_compute[226322]: 2026-01-26 10:16:55.066 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:55.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:55.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:16:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:16:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:57.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:57.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:16:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:16:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:16:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:16:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:16:58 np0005595445 nova_compute[226322]: 2026-01-26 10:16:58.891 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:16:59 np0005595445 podman[238417]: 2026-01-26 10:16:59.337895497 +0000 UTC m=+0.106639256 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 05:16:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:16:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:16:59.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:16:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:16:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:16:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:16:59.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:00 np0005595445 nova_compute[226322]: 2026-01-26 10:17:00.067 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:01.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:01.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:03.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:03.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:03 np0005595445 nova_compute[226322]: 2026-01-26 10:17:03.893 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.800 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.801 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.802 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.803 226326 INFO nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Terminating instance#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.804 226326 DEBUG nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 26 05:17:04 np0005595445 kernel: tape8f0e0cf-36 (unregistering): left promiscuous mode
Jan 26 05:17:04 np0005595445 NetworkManager[49073]: <info>  [1769422624.8559] device (tape8f0e0cf-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 05:17:04 np0005595445 ovn_controller[133670]: 2026-01-26T10:17:04Z|00072|binding|INFO|Releasing lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae from this chassis (sb_readonly=0)
Jan 26 05:17:04 np0005595445 ovn_controller[133670]: 2026-01-26T10:17:04Z|00073|binding|INFO|Setting lport e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae down in Southbound
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.902 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:04 np0005595445 ovn_controller[133670]: 2026-01-26T10:17:04Z|00074|binding|INFO|Removing iface tape8f0e0cf-36 ovn-installed in OVS
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.904 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:04 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.911 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:1d:9d 10.100.0.12'], port_security=['fa:16:3e:97:1d:9d 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b040151a-46d9-4685-84c4-316c2d7feedb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed221b375a44fc2bb2a8f232c5446e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5cce9ca3-082e-4a27-8023-5db6a50012d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db1eda71-392f-4d4b-8724-78530674037e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>], logical_port=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f24b0cb3640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:17:04 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.912 143326 INFO neutron.agent.ovn.metadata.agent [-] Port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae in datapath 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e unbound from our chassis#033[00m
Jan 26 05:17:04 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.913 143326 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 26 05:17:04 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.914 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e40cd8-d02c-4ac5-b4a6-5eb34bfd1bfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:04 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:04.915 143326 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e namespace which is not needed anymore#033[00m
Jan 26 05:17:04 np0005595445 nova_compute[226322]: 2026-01-26 10:17:04.920 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:04 np0005595445 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 26 05:17:04 np0005595445 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 14.093s CPU time.
Jan 26 05:17:04 np0005595445 systemd-machined[194876]: Machine qemu-5-instance-0000000c terminated.
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.039 226326 INFO nova.virt.libvirt.driver [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Instance destroyed successfully.#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.040 226326 DEBUG nova.objects.instance [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lazy-loading 'resources' on Instance uuid b040151a-46d9-4685-84c4-316c2d7feedb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 26 05:17:05 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : haproxy version is 2.8.14-c23fe91
Jan 26 05:17:05 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [NOTICE]   (238194) : path to executable is /usr/sbin/haproxy
Jan 26 05:17:05 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [WARNING]  (238194) : Exiting Master process...
Jan 26 05:17:05 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [ALERT]    (238194) : Current worker (238196) exited with code 143 (Terminated)
Jan 26 05:17:05 np0005595445 neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e[238190]: [WARNING]  (238194) : All workers exited. Exiting... (0)
Jan 26 05:17:05 np0005595445 systemd[1]: libpod-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope: Deactivated successfully.
Jan 26 05:17:05 np0005595445 podman[238464]: 2026-01-26 10:17:05.054030205 +0000 UTC m=+0.052063434 container died ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.069 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.305 226326 DEBUG nova.virt.libvirt.vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-26T10:16:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1174728540',display_name='tempest-TestNetworkBasicOps-server-1174728540',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1174728540',id=12,image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB+EGw3c7uXEi4OTNwulYSgVH+l4n13lnOEqRefy0kW9/M5qFO/QglqfVEv83QxC3p6/WVWNJKRmKF5HdJHv8FbQly+oOPz0zT0wKKx+uweCnuuDtsTED/V9sqhB977UJw==',key_name='tempest-TestNetworkBasicOps-1514805345',keypairs=<?>,launch_index=0,launched_at=2026-01-26T10:16:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ed221b375a44fc2bb2a8f232c5446e7',ramdisk_id='',reservation_id='r-hl8103a0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6789692f-fc1f-4efa-ae75-dcc13be695ef',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-966559857',owner_user_name='tempest-TestNetworkBasicOps-966559857-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T10:16:35Z,user_data=None,user_id='c1208d3e25b940ea93fe76884c7a53db',uuid=b040151a-46d9-4685-84c4-316c2d7feedb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.306 226326 DEBUG nova.network.os_vif_util [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converting VIF {"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.306 226326 DEBUG nova.network.os_vif_util [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.307 226326 DEBUG os_vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.308 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.308 226326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8f0e0cf-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.310 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.313 226326 INFO os_vif [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:1d:9d,bridge_name='br-int',has_traffic_filtering=True,id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae,network=Network(9bff64e0-694f-4b2d-b4b5-5e3b1d94460e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape8f0e0cf-36')#033[00m
Jan 26 05:17:05 np0005595445 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a-userdata-shm.mount: Deactivated successfully.
Jan 26 05:17:05 np0005595445 systemd[1]: var-lib-containers-storage-overlay-78225d3fda0ba19bc05bebf3fc0755524da6232cfdd0544b56f1d421e656d4c8-merged.mount: Deactivated successfully.
Jan 26 05:17:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:05.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.826 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG oslo_concurrency.lockutils [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.827 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:17:05 np0005595445 nova_compute[226322]: 2026-01-26 10:17:05.828 226326 DEBUG nova.compute.manager [req-95e36ca9-58bb-4cdc-8282-20d365e5cabd req-ed5aef2f-5856-492c-b86c-f85d004b5e70 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-unplugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 26 05:17:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:05.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.026 226326 DEBUG nova.compute.manager [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.026 226326 DEBUG nova.compute.manager [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing instance network info cache due to event network-changed-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquired lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.027 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Refreshing network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 26 05:17:06 np0005595445 podman[238464]: 2026-01-26 10:17:06.031232945 +0000 UTC m=+1.029266174 container cleanup ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:17:06 np0005595445 systemd[1]: libpod-conmon-ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a.scope: Deactivated successfully.
Jan 26 05:17:06 np0005595445 podman[238551]: 2026-01-26 10:17:06.124607349 +0000 UTC m=+0.065644216 container remove ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.136 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[284337f3-ec71-428e-ab97-6b66626eb86c]: (4, ('Mon Jan 26 10:17:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e (ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a)\nec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a\nMon Jan 26 10:17:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e (ec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a)\nec686f370933e73f061d9a71955dc41f5fb9f6328a46d883ec7954f5f148d74a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.139 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b9518e-5438-42d9-a4a4-4209932f3089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.141 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bff64e0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.144 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:06 np0005595445 kernel: tap9bff64e0-60: left promiscuous mode
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.147 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.150 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9aa140-895a-40f2-af9a-18c9047da115]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.162 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.164 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[a8816bea-5ea9-4304-8c60-686badd6ce65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.165 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4f472e-3efd-4fad-afea-badcbd26a7e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.192 229912 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ddd192-cab8-4916-9476-c4657d0105b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455552, 'reachable_time': 30589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238567, 'error': None, 'target': 'ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.196 143615 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bff64e0-694f-4b2d-b4b5-5e3b1d94460e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 26 05:17:06 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:06.196 143615 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d3ed7b-e77f-45b2-83ca-5e000a1e77e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 26 05:17:06 np0005595445 systemd[1]: run-netns-ovnmeta\x2d9bff64e0\x2d694f\x2d4b2d\x2db4b5\x2d5e3b1d94460e.mount: Deactivated successfully.
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.393 226326 INFO nova.virt.libvirt.driver [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deleting instance files /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb_del#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.394 226326 INFO nova.virt.libvirt.driver [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deletion of /var/lib/nova/instances/b040151a-46d9-4685-84c4-316c2d7feedb_del complete#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.447 226326 INFO nova.compute.manager [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 1.64 seconds to destroy the instance on the hypervisor.#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG oslo.service.loopingcall [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 26 05:17:06 np0005595445 nova_compute[226322]: 2026-01-26 10:17:06.448 226326 DEBUG nova.network.neutron [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 26 05:17:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:07 np0005595445 nova_compute[226322]: 2026-01-26 10:17:07.564 226326 DEBUG nova.network.neutron [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:17:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:07.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:07 np0005595445 nova_compute[226322]: 2026-01-26 10:17:07.846 226326 INFO nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Took 1.40 seconds to deallocate network for instance.#033[00m
Jan 26 05:17:07 np0005595445 nova_compute[226322]: 2026-01-26 10:17:07.857 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updated VIF entry in instance network info cache for port e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 26 05:17:07 np0005595445 nova_compute[226322]: 2026-01-26 10:17:07.858 226326 DEBUG nova.network.neutron [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [{"id": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "address": "fa:16:3e:97:1d:9d", "network": {"id": "9bff64e0-694f-4b2d-b4b5-5e3b1d94460e", "bridge": "br-int", "label": "tempest-network-smoke--2141113135", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed221b375a44fc2bb2a8f232c5446e7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape8f0e0cf-36", "ovs_interfaceid": "e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:17:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:07.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.052 226326 DEBUG nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.053 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Acquiring lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.053 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.054 226326 DEBUG oslo_concurrency.lockutils [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.054 226326 DEBUG nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] No waiting events found dispatching network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.055 226326 WARNING nova.compute.manager [req-876d7fa6-f212-49a1-b153-9f872622ea70 req-59e3e6aa-9bd3-4a89-a4c0-e9b85c1645b3 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received unexpected event network-vif-plugged-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae for instance with vm_state active and task_state deleting.#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.092 226326 DEBUG oslo_concurrency.lockutils [req-9332e68d-7895-479b-937d-a13200fb5d82 req-723294d8-0094-4732-9a05-ba6b5572064c b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] Releasing lock "refresh_cache-b040151a-46d9-4685-84c4-316c2d7feedb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.101 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.102 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.176 226326 DEBUG oslo_concurrency.processutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.271 226326 DEBUG nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Received event network-vif-deleted-e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.272 226326 INFO nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Neutron deleted interface e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae; detaching it from the instance and deleting it from the info cache#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.272 226326 DEBUG nova.network.neutron [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.293 226326 DEBUG nova.compute.manager [req-d886c676-4eac-449e-b225-63b6578784ef req-065daaa1-5dab-43eb-9f7c-fe1991596bc7 b3cedad3bffb466c8c89f0c66461ccc7 d522de7bb1e84f808e55320745abb962 - - default default] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Detach interface failed, port_id=e8f0e0cf-363c-4f2d-90d8-8c1b9cb9aeae, reason: Instance b040151a-46d9-4685-84c4-316c2d7feedb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 26 05:17:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:17:08 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1907053133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.613 226326 DEBUG oslo_concurrency.processutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.618 226326 DEBUG nova.compute.provider_tree [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.822 226326 DEBUG nova.scheduler.client.report [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.861 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:08 np0005595445 nova_compute[226322]: 2026-01-26 10:17:08.896 226326 INFO nova.scheduler.client.report [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Deleted allocations for instance b040151a-46d9-4685-84c4-316c2d7feedb#033[00m
Jan 26 05:17:09 np0005595445 nova_compute[226322]: 2026-01-26 10:17:09.208 226326 DEBUG oslo_concurrency.lockutils [None req-eb91376d-376c-4e86-b60e-89acb6231468 c1208d3e25b940ea93fe76884c7a53db 6ed221b375a44fc2bb2a8f232c5446e7 - - default default] Lock "b040151a-46d9-4685-84c4-316c2d7feedb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:09.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:09.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:10 np0005595445 nova_compute[226322]: 2026-01-26 10:17:10.070 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:10 np0005595445 nova_compute[226322]: 2026-01-26 10:17:10.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:11.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:11.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:13.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:13.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:14 np0005595445 podman[238598]: 2026-01-26 10:17:14.297879622 +0000 UTC m=+0.079418122 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 26 05:17:15 np0005595445 nova_compute[226322]: 2026-01-26 10:17:15.075 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:15 np0005595445 nova_compute[226322]: 2026-01-26 10:17:15.312 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:15.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:15.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:17.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:17.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:19.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:20 np0005595445 nova_compute[226322]: 2026-01-26 10:17:20.038 226326 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769422625.0362768, b040151a-46d9-4685-84c4-316c2d7feedb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 26 05:17:20 np0005595445 nova_compute[226322]: 2026-01-26 10:17:20.038 226326 INFO nova.compute.manager [-] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] VM Stopped (Lifecycle Event)#033[00m
Jan 26 05:17:20 np0005595445 nova_compute[226322]: 2026-01-26 10:17:20.061 226326 DEBUG nova.compute.manager [None req-93449595-6e98-4855-a091-dc9e11be79d5 - - - - - -] [instance: b040151a-46d9-4685-84c4-316c2d7feedb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 26 05:17:20 np0005595445 nova_compute[226322]: 2026-01-26 10:17:20.076 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:20 np0005595445 nova_compute[226322]: 2026-01-26 10:17:20.314 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:21.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:21.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:23 np0005595445 nova_compute[226322]: 2026-01-26 10:17:23.737 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:23 np0005595445 nova_compute[226322]: 2026-01-26 10:17:23.848 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:17:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:25 np0005595445 nova_compute[226322]: 2026-01-26 10:17:25.103 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:25 np0005595445 nova_compute[226322]: 2026-01-26 10:17:25.315 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:25.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:27.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:29.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:30 np0005595445 nova_compute[226322]: 2026-01-26 10:17:30.105 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:30 np0005595445 podman[238662]: 2026-01-26 10:17:30.274938641 +0000 UTC m=+0.049825171 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 05:17:30 np0005595445 nova_compute[226322]: 2026-01-26 10:17:30.316 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:31.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:32 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:32.054 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:17:32 np0005595445 nova_compute[226322]: 2026-01-26 10:17:32.054 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:32 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:32.055 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:17:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:33.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:33.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:17:35 np0005595445 nova_compute[226322]: 2026-01-26 10:17:35.106 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:35 np0005595445 nova_compute[226322]: 2026-01-26 10:17:35.318 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:35.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:35.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:36 np0005595445 nova_compute[226322]: 2026-01-26 10:17:36.256 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:36 np0005595445 nova_compute[226322]: 2026-01-26 10:17:36.257 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:37 np0005595445 nova_compute[226322]: 2026-01-26 10:17:37.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:37 np0005595445 nova_compute[226322]: 2026-01-26 10:17:37.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:37.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:37.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.701 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:38 np0005595445 nova_compute[226322]: 2026-01-26 10:17:38.702 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:17:39 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:39.058 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:17:39 np0005595445 nova_compute[226322]: 2026-01-26 10:17:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:39.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:39.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:40 np0005595445 nova_compute[226322]: 2026-01-26 10:17:40.108 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:40 np0005595445 nova_compute[226322]: 2026-01-26 10:17:40.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:40 np0005595445 nova_compute[226322]: 2026-01-26 10:17:40.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:41.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:41.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:43 np0005595445 nova_compute[226322]: 2026-01-26 10:17:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:17:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:43.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:43.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.079 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.080 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.081 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1005788338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.920546) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664920916, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2389, "num_deletes": 251, "total_data_size": 6512990, "memory_usage": 6591552, "flush_reason": "Manual Compaction"}
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 26 05:17:44 np0005595445 nova_compute[226322]: 2026-01-26 10:17:44.928 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.847s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664952430, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4191624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31238, "largest_seqno": 33621, "table_properties": {"data_size": 4181901, "index_size": 6153, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20235, "raw_average_key_size": 20, "raw_value_size": 4162471, "raw_average_value_size": 4221, "num_data_blocks": 264, "num_entries": 986, "num_filter_entries": 986, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422455, "oldest_key_time": 1769422455, "file_creation_time": 1769422664, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 31766 microseconds, and 13285 cpu microseconds.
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.952609) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4191624 bytes OK
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.952676) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954659) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954674) EVENT_LOG_v1 {"time_micros": 1769422664954669, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.954725) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6502424, prev total WAL file size 6502424, number of live WAL files 2.
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.956686) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4093KB)], [60(11MB)]
Jan 26 05:17:44 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422664956769, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16529384, "oldest_snapshot_seqno": -1}
Jan 26 05:17:45 np0005595445 podman[238713]: 2026-01-26 10:17:45.055439048 +0000 UTC m=+0.085536202 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6227 keys, 14392038 bytes, temperature: kUnknown
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665071226, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14392038, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14350719, "index_size": 24633, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 159448, "raw_average_key_size": 25, "raw_value_size": 14238855, "raw_average_value_size": 2286, "num_data_blocks": 990, "num_entries": 6227, "num_filter_entries": 6227, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422664, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.071521) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14392038 bytes
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.073413) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.3 rd, 125.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 11.8 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 6748, records dropped: 521 output_compression: NoCompression
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.073434) EVENT_LOG_v1 {"time_micros": 1769422665073424, "job": 36, "event": "compaction_finished", "compaction_time_micros": 114548, "compaction_time_cpu_micros": 36189, "output_level": 6, "num_output_files": 1, "total_output_size": 14392038, "num_input_records": 6748, "num_output_records": 6227, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665074534, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422665077584, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:44.956622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:17:45.077663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.100 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4847MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.102 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.111 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.165 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.165 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.182 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.243 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.243 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.258 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.283 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.313 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.328 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:17:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/338352986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.754 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.759 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.772 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.790 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:17:45 np0005595445 nova_compute[226322]: 2026-01-26 10:17:45.791 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:45.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:45.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:17:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:47.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:17:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:47.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:49.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:50 np0005595445 nova_compute[226322]: 2026-01-26 10:17:50.113 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:50 np0005595445 nova_compute[226322]: 2026-01-26 10:17:50.330 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:51.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:17:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:53.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:17:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.941 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:17:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:17:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:17:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:17:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:54.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:55 np0005595445 nova_compute[226322]: 2026-01-26 10:17:55.116 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:55 np0005595445 nova_compute[226322]: 2026-01-26 10:17:55.332 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:17:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:55.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:56.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:17:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:57.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:17:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:17:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:17:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:17:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:17:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:17:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:17:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:17:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:17:58.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:17:58 np0005595445 ovn_controller[133670]: 2026-01-26T10:17:58Z|00075|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 26 05:17:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:17:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:17:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:17:59.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:00 np0005595445 nova_compute[226322]: 2026-01-26 10:18:00.118 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:00 np0005595445 nova_compute[226322]: 2026-01-26 10:18:00.333 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:00.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:00 np0005595445 podman[238884]: 2026-01-26 10:18:00.812620916 +0000 UTC m=+0.057975967 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 26 05:18:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:01.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:02.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:03.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:04.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:04 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:18:04 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:18:05 np0005595445 nova_compute[226322]: 2026-01-26 10:18:05.119 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:05 np0005595445 nova_compute[226322]: 2026-01-26 10:18:05.335 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:05.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:06.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:07.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:08.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:09.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:10 np0005595445 nova_compute[226322]: 2026-01-26 10:18:10.122 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:10 np0005595445 nova_compute[226322]: 2026-01-26 10:18:10.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:10.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:11.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:12.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:12.552 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:18:12 np0005595445 nova_compute[226322]: 2026-01-26 10:18:12.554 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:12 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:12.554 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:18:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:13.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:14.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:15 np0005595445 nova_compute[226322]: 2026-01-26 10:18:15.125 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:15 np0005595445 podman[238961]: 2026-01-26 10:18:15.321732972 +0000 UTC m=+0.105719991 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 05:18:15 np0005595445 nova_compute[226322]: 2026-01-26 10:18:15.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:15 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:15.556 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:18:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:15.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:16.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:17.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:18.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:18:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:20 np0005595445 nova_compute[226322]: 2026-01-26 10:18:20.127 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:20 np0005595445 nova_compute[226322]: 2026-01-26 10:18:20.341 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:20.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:22.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:18:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:24.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:25 np0005595445 nova_compute[226322]: 2026-01-26 10:18:25.128 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:25 np0005595445 nova_compute[226322]: 2026-01-26 10:18:25.342 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:25.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:26.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:27.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:28.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:30 np0005595445 nova_compute[226322]: 2026-01-26 10:18:30.131 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:30 np0005595445 nova_compute[226322]: 2026-01-26 10:18:30.344 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:30.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:31 np0005595445 podman[239024]: 2026-01-26 10:18:31.292806041 +0000 UTC m=+0.075999268 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 26 05:18:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:31.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:32.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:33.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:34.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:34 np0005595445 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 05:18:35 np0005595445 nova_compute[226322]: 2026-01-26 10:18:35.133 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:35 np0005595445 nova_compute[226322]: 2026-01-26 10:18:35.346 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:35.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:36.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:36 np0005595445 nova_compute[226322]: 2026-01-26 10:18:36.792 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:36 np0005595445 nova_compute[226322]: 2026-01-26 10:18:36.792 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:37 np0005595445 nova_compute[226322]: 2026-01-26 10:18:37.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:37 np0005595445 nova_compute[226322]: 2026-01-26 10:18:37.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:37.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:38.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:38 np0005595445 nova_compute[226322]: 2026-01-26 10:18:38.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:38 np0005595445 nova_compute[226322]: 2026-01-26 10:18:38.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:18:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:18:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:39.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.135 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.348 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:40.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.714 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:18:40 np0005595445 nova_compute[226322]: 2026-01-26 10:18:40.715 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:41 np0005595445 nova_compute[226322]: 2026-01-26 10:18:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:41.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:42.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:43.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:18:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:44.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.350 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:18:45 np0005595445 podman[239055]: 2026-01-26 10:18:45.488910271 +0000 UTC m=+0.116889941 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.717 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:18:45 np0005595445 nova_compute[226322]: 2026-01-26 10:18:45.717 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:18:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:18:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:45.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:18:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:18:46 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3089679829' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.186 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.343 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.344 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4870MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.344 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.345 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:18:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:46.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.513 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.514 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.532 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:18:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:18:46 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3843976593' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.981 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 26 05:18:46 np0005595445 nova_compute[226322]: 2026-01-26 10:18:46.986 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 26 05:18:47 np0005595445 nova_compute[226322]: 2026-01-26 10:18:47.010 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 26 05:18:47 np0005595445 nova_compute[226322]: 2026-01-26 10:18:47.012 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 26 05:18:47 np0005595445 nova_compute[226322]: 2026-01-26 10:18:47.012 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:18:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:48.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:49.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:50 np0005595445 nova_compute[226322]: 2026-01-26 10:18:50.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:18:50 np0005595445 nova_compute[226322]: 2026-01-26 10:18:50.352 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:18:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:18:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:50.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:18:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:52.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:52.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.942 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 26 05:18:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 26 05:18:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:18:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 26 05:18:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:54.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:54.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:55 np0005595445 nova_compute[226322]: 2026-01-26 10:18:55.146 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:18:55 np0005595445 nova_compute[226322]: 2026-01-26 10:18:55.412 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:18:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:56.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:56.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:18:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:18:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:18:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:18:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:18:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:18:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:18:58.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:18:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:18:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:18:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:18:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1448684425' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:18:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:18:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:18:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:18:58.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:00.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:00 np0005595445 nova_compute[226322]: 2026-01-26 10:19:00.149 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:00 np0005595445 nova_compute[226322]: 2026-01-26 10:19:00.414 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:02.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:02 np0005595445 podman[239159]: 2026-01-26 10:19:02.275109933 +0000 UTC m=+0.049689178 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 26 05:19:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:02.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:04.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:04.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:05 np0005595445 nova_compute[226322]: 2026-01-26 10:19:05.175 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:19:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:19:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:19:05 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:19:05 np0005595445 nova_compute[226322]: 2026-01-26 10:19:05.415 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:06.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:08.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:10.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:10 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:19:10 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:19:10 np0005595445 nova_compute[226322]: 2026-01-26 10:19:10.176 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:10 np0005595445 nova_compute[226322]: 2026-01-26 10:19:10.417 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 26 05:19:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:10.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:12.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:12.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:14.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:14.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.548622) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754548673, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1162, "num_deletes": 250, "total_data_size": 2809440, "memory_usage": 2841704, "flush_reason": "Manual Compaction"}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754565155, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1805113, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33626, "largest_seqno": 34783, "table_properties": {"data_size": 1799984, "index_size": 2589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 10304, "raw_average_key_size": 18, "raw_value_size": 1789742, "raw_average_value_size": 3156, "num_data_blocks": 114, "num_entries": 567, "num_filter_entries": 567, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422665, "oldest_key_time": 1769422665, "file_creation_time": 1769422754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 16591 microseconds, and 7245 cpu microseconds.
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.565213) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1805113 bytes OK
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.565243) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567775) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567797) EVENT_LOG_v1 {"time_micros": 1769422754567790, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.567837) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 2803804, prev total WAL file size 2803804, number of live WAL files 2.
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.568961) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323532' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1762KB)], [63(13MB)]
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754569024, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 16197151, "oldest_snapshot_seqno": -1}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6280 keys, 14950493 bytes, temperature: kUnknown
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754703944, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14950493, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14908118, "index_size": 25561, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15749, "raw_key_size": 162279, "raw_average_key_size": 25, "raw_value_size": 14794509, "raw_average_value_size": 2355, "num_data_blocks": 1015, "num_entries": 6280, "num_filter_entries": 6280, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.704272) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14950493 bytes
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.722273) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.0 rd, 110.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.7 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(17.3) write-amplify(8.3) OK, records in: 6794, records dropped: 514 output_compression: NoCompression
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.722322) EVENT_LOG_v1 {"time_micros": 1769422754722303, "job": 38, "event": "compaction_finished", "compaction_time_micros": 135026, "compaction_time_cpu_micros": 41453, "output_level": 6, "num_output_files": 1, "total_output_size": 14950493, "num_input_records": 6794, "num_output_records": 6280, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754723177, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422754727852, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.568869) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727905) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727908) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:14 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:19:14.727913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:19:15 np0005595445 nova_compute[226322]: 2026-01-26 10:19:15.264 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:15 np0005595445 nova_compute[226322]: 2026-01-26 10:19:15.418 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:16.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:16 np0005595445 podman[239318]: 2026-01-26 10:19:16.316496945 +0000 UTC m=+0.098936904 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 26 05:19:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:16.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:18.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:18.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:20.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:20 np0005595445 nova_compute[226322]: 2026-01-26 10:19:20.265 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:20 np0005595445 nova_compute[226322]: 2026-01-26 10:19:20.420 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:20.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:22.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:22.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:23 np0005595445 systemd-logind[783]: New session 55 of user zuul.
Jan 26 05:19:23 np0005595445 systemd[1]: Started Session 55 of User zuul.
Jan 26 05:19:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:24.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:24.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:25 np0005595445 nova_compute[226322]: 2026-01-26 10:19:25.313 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:25 np0005595445 nova_compute[226322]: 2026-01-26 10:19:25.422 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:26.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:26.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:26 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 05:19:26 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4108331622' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 05:19:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:28.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:28.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:19:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:30.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:19:30 np0005595445 nova_compute[226322]: 2026-01-26 10:19:30.315 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:30 np0005595445 nova_compute[226322]: 2026-01-26 10:19:30.423 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:30.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:31 np0005595445 ovs-vsctl[239748]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 05:19:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:32.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:32.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:32 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 05:19:32 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 05:19:32 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 05:19:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:33 np0005595445 podman[239967]: 2026-01-26 10:19:33.28656198 +0000 UTC m=+0.067347138 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:19:33 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: cache status {prefix=cache status} (starting...)
Jan 26 05:19:33 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:33 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: client ls {prefix=client ls} (starting...)
Jan 26 05:19:33 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:33 np0005595445 lvm[240120]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 05:19:33 np0005595445 lvm[240120]: VG ceph_vg0 finished
Jan 26 05:19:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:19:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:34.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: damage ls {prefix=damage ls} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump loads {prefix=dump loads} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 26 05:19:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1630855317' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:34.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:34 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 26 05:19:34 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1100227087' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 26 05:19:34 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2569982764' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:35 np0005595445 nova_compute[226322]: 2026-01-26 10:19:35.318 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:35 np0005595445 nova_compute[226322]: 2026-01-26 10:19:35.426 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: ops {prefix=ops} (starting...)
Jan 26 05:19:35 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2035422281' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/109052100' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 05:19:35 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2878709684' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 05:19:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:36.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:36 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: session ls {prefix=session ls} (starting...)
Jan 26 05:19:36 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:19:36 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: status {prefix=status} (starting...)
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3164859405' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 05:19:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:36.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1762731804' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 26 05:19:36 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1146455129' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3257461002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/130987033' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/655899828' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 26 05:19:37 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3972199707' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 05:19:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.012 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.013 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.013 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:38.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4231684812' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2147534676' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/325511667' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 05:19:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:19:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:38.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 05:19:38 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3812039934' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:38 np0005595445 nova_compute[226322]: 2026-01-26 10:19:38.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4000464440' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1848180643' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 05:19:39 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/136388260' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 737280 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 729088 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927696 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 729088 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73605120 unmapped: 720896 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172495400 session 0x55d16feee000
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 712704 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 712704 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927696 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 704512 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.554183006s of 12.558165550s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 696320 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927105 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 688128 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 688128 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 679936 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 679936 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 671744 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927105 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73662464 unmapped: 663552 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73662464 unmapped: 663552 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.911679268s of 10.922836304s, submitted: 3
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 655360 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 647168 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 647168 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 638976 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 630784 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 622592 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 622592 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 614400 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 606208 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d172658b40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 598016 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 589824 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 581632 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 581632 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 573440 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 565248 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 565248 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 557056 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 540672 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 540672 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930129 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.089416504s of 32.094387054s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 532480 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 524288 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 524288 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 516096 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 507904 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73818112 unmapped: 507904 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d172e80780
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 499712 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 491520 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 491520 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 483328 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 475136 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 466944 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 466944 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 450560 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 442368 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 442368 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 434176 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 425984 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 425984 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 417792 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 409600 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 401408 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73932800 unmapped: 393216 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928947 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 385024 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 376832 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 41.662242889s of 41.667964935s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 376832 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 368640 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 368640 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 360448 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 352256 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73981952 unmapped: 344064 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 335872 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 327680 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 319488 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 303104 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74022912 unmapped: 303104 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74031104 unmapped: 294912 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 278528 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74055680 unmapped: 270336 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74063872 unmapped: 262144 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74072064 unmapped: 253952 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74080256 unmapped: 245760 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172efef00
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74088448 unmapped: 237568 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74096640 unmapped: 229376 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74096640 unmapped: 229376 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74104832 unmapped: 221184 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74113024 unmapped: 212992 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74121216 unmapped: 204800 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74129408 unmapped: 196608 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74137600 unmapped: 188416 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74145792 unmapped: 180224 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 172032 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 163840 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930459 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74170368 unmapped: 155648 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 60.023979187s of 60.028354645s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74178560 unmapped: 147456 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74186752 unmapped: 139264 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74194944 unmapped: 131072 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74203136 unmapped: 122880 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74211328 unmapped: 114688 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 106496 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74227712 unmapped: 98304 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74235904 unmapped: 90112 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74244096 unmapped: 81920 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74252288 unmapped: 73728 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6316 writes, 26K keys, 6316 commit groups, 1.0 writes per commit group, ingest: 19.65 MB, 0.03 MB/s#012Interval WAL: 6316 writes, 1069 syncs, 5.91 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slo
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 16384 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 8192 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 8192 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 0 heap: 74326016 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 1040384 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 1032192 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1024000 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74350592 unmapped: 1024000 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 1015808 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74366976 unmapped: 1007616 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 999424 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74383360 unmapped: 991232 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 983040 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74399744 unmapped: 974848 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74407936 unmapped: 966656 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74416128 unmapped: 958464 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 950272 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74432512 unmapped: 942080 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 933888 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74448896 unmapped: 925696 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 917504 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74465280 unmapped: 909312 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 901120 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 901120 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 892928 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74489856 unmapped: 884736 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74498048 unmapped: 876544 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74506240 unmapped: 868352 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d17293ba40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74514432 unmapped: 860160 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74514432 unmapped: 860160 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74522624 unmapped: 851968 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74539008 unmapped: 835584 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74547200 unmapped: 827392 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 819200 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74563584 unmapped: 811008 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 794624 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 786432 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 778240 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929868 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 114.382240295s of 114.386558533s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 761856 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 745472 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 737280 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 737280 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 720896 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74670080 unmapped: 704512 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74670080 unmapped: 704512 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 696320 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 688128 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 688128 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 679936 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74702848 unmapped: 671744 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74702848 unmapped: 671744 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74711040 unmapped: 663552 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74711040 unmapped: 663552 heap: 75374592 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.977069855s of 23.989372253s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 74809344 unmapped: 1613824 heap: 76423168 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76980224 unmapped: 491520 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76988416 unmapped: 483328 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d1725c52c0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 76996608 unmapped: 475136 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77004800 unmapped: 466944 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931380 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.814674377s of 33.227375031s, submitted: 154
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932892 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77012992 unmapped: 458752 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77094912 unmapped: 376832 heap: 77471744 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 1327104 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 1327104 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 1318912 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 1310720 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 1302528 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 1294336 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 1286144 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 1277952 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 1269760 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 1261568 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 1261568 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d17019eb40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172164400 session 0x55d172e48d20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77275136 unmapped: 1245184 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932301 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 92.695404053s of 93.076416016s, submitted: 88
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933813 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77291520 unmapped: 1228800 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933813 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 1212416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 1196032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77340672 unmapped: 1179648 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1731454a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77348864 unmapped: 1171456 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933222 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1146880 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 69.010238647s of 69.024299622s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934734 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77381632 unmapped: 1138688 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77398016 unmapped: 1122304 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d1703c43c0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77406208 unmapped: 1114112 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 934143 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77422592 unmapped: 1097728 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 45.224506378s of 45.233642578s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935655 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 1064960 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 937167 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 1056768 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17311cd20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 1040384 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 1032192 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 1015808 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936576 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 991232 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 32.827796936s of 32.840641022s, submitted: 3
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172162400 session 0x55d170046780
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77545472 unmapped: 974848 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77586432 unmapped: 933888 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d170b6b400 session 0x55d1721d3a40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 74.881858826s of 74.889221191s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77643776 unmapped: 876544 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1720950e0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.996082306s of 18.000623703s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.481588364s of 14.488451958s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77701120 unmapped: 819200 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939339 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172095c20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d17260d0e0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.857223511s of 56.950168610s, submitted: 4
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d172e81860
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] 
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 58.666145325s of 58.674335480s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17260c1e0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d16feeed20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.543655396s of 56.563098907s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.324966431s of 10.332220078s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.846134186s of 36.849975586s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78290944 unmapped: 229376 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 37.952041626s of 38.464164734s, submitted: 139
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78381056 unmapped: 1187840 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 52.869400024s of 53.368377686s, submitted: 86
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.398575783s of 12.402492523s, submitted: 1
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17254c000 session 0x55d17293a5a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.989221573s of 20.182910919s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 945456 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d16feee1e0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 79.554496765s of 79.564147949s, submitted: 3
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 79724544 unmapped: 16629760 heap: 96354304 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 146 ms_handle_reset con 0x55d171202800 session 0x55d17263d4a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 146 heartbeat osd_stat(store_statfs(0x4fb58a000/0x0/0x4ffc00000, data 0x15d8375/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 147 ms_handle_reset con 0x55d17042ac00 session 0x55d172584d20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1101069 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 147 heartbeat osd_stat(store_statfs(0x4fb585000/0x0/0x4ffc00000, data 0x15da4a0/0x1695000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.111835480s of 40.450466156s, submitted: 51
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1104407 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102976 data_alloc: 218103808 data_used: 176128
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d172151000 session 0x55d172095e00
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d17035d000 session 0x55d172672960
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 12337152 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857666016s of 12.088842392s, submitted: 2
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb57c000/0x0/0x4ffc00000, data 0x15e06b1/0x169f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d17019b680
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d171202800 session 0x55d172671680
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17254c000 session 0x55d172671860
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167114 data_alloc: 234881024 data_used: 11649024
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d172153400 session 0x55d172671a40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17035d000 session 0x55d172671c20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb234000/0x0/0x4ffc00000, data 0x19286b1/0x19e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d1726714a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170865 data_alloc: 234881024 data_used: 11649024
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94191616 unmapped: 10559488 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d170188f00
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97239040 unmapped: 7512064 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.199676514s of 19.449176788s, submitted: 39
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102621184 unmapped: 4235264 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102588416 unmapped: 4268032 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa917000/0x0/0x4ffc00000, data 0x2245683/0x2305000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274279 data_alloc: 234881024 data_used: 15114240
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa876000/0x0/0x4ffc00000, data 0x22e6683/0x23a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272359 data_alloc: 234881024 data_used: 15114240
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa852000/0x0/0x4ffc00000, data 0x230a683/0x23ca000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.935168266s of 12.713699341s, submitted: 84
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272375 data_alloc: 234881024 data_used: 15114240
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271784 data_alloc: 234881024 data_used: 15114240
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172672d20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172994000 session 0x55d172896780
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17019e000
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102088704 unmapped: 4767744 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1725c54a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273484 data_alloc: 234881024 data_used: 15114240
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1725c5680
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103366656 unmapped: 3489792 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266eb40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.664448738s of 11.864383698s, submitted: 7
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103407616 unmapped: 9748480 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8cd20
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d1726721e0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d17034ef00
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1703c14a0
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17289b680
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318619 data_alloc: 234881024 data_used: 15642624
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172661400 session 0x55d16f9e6960
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 8486912 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355120 data_alloc: 234881024 data_used: 20631552
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17019ab40
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:39 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360592 data_alloc: 234881024 data_used: 21483520
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.041341782s of 14.129011154s, submitted: 19
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 4947968 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 4915200 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360332 data_alloc: 234881024 data_used: 21483520
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109010944 unmapped: 4145152 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 2949120 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 5578752 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113344512 unmapped: 4759552 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1447918 data_alloc: 234881024 data_used: 22425600
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113582080 unmapped: 4521984 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.826783180s of 12.373358727s, submitted: 98
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442638 data_alloc: 234881024 data_used: 22425600
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83ea000/0x0/0x4ffc00000, data 0x31c2683/0x3282000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17289a000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1703c6d20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1730cab40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370749 data_alloc: 234881024 data_used: 18501632
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d170d8c1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172e48000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f905a000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285633 data_alloc: 234881024 data_used: 15642624
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.467782021s of 12.590607643s, submitted: 41
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba1860
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109109248 unmapped: 8994816 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9283000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,1])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 11517952 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1728970e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.436046600s of 11.544039726s, submitted: 32
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d172ba0780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172ba1a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172ba1680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172ba14a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba0d20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1701881e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.435865402s of 23.439153671s, submitted: 1
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170045 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170188f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170d8c960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170d8c000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d170d8c5a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993000 session 0x55d16f9e6960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d16f9e7a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d16f9e6000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193785 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d16f9e65a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ee6b40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106422272 unmapped: 13852672 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.067176819s of 19.197723389s, submitted: 13
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10002432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271222 data_alloc: 234881024 data_used: 15110144
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109248512 unmapped: 11026432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9553000/0x0/0x4ffc00000, data 0x2052693/0x2113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109895680 unmapped: 10379264 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096400 session 0x55d17263d2c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.033960342s of 30.179452896s, submitted: 56
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1730ca5a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 10371072 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172670000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172effe00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1726e9680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172b9e1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172e81c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.123188019s of 25.234869003s, submitted: 31
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 116154368 unmapped: 21454848 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172672780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172896f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d170400d20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172997800 session 0x55d170400000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1703e32c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170047680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.038774490s of 20.122617722s, submitted: 17
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 17965056 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1445412 data_alloc: 251658240 data_used: 28798976
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 15523840 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122789888 unmapped: 14819328 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a3b000/0x0/0x4ffc00000, data 0x2b71683/0x2c31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s#012Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d173144f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172037a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730ca5a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730caf00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.439634323s of 18.733009338s, submitted: 55
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172896960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172ee7c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d170188960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172ee6b40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1476366 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259ab40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1481927 data_alloc: 251658240 data_used: 30072832
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 15736832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494239 data_alloc: 251658240 data_used: 31916032
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123510784 unmapped: 14098432 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.753458977s of 13.886870384s, submitted: 41
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87c8000/0x0/0x4ffc00000, data 0x2de1708/0x2ea3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494831 data_alloc: 251658240 data_used: 31920128
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131710976 unmapped: 5898240 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 133734400 unmapped: 3874816 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7899000/0x0/0x4ffc00000, data 0x3d11708/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622323 data_alloc: 251658240 data_used: 33873920
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.394704819s of 10.849593163s, submitted: 125
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132079616 unmapped: 5529600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720374a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f786f000/0x0/0x4ffc00000, data 0x3d3b708/0x3dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1703c65a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b826a6/0x2c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170047c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17259b680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d172671860
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b82683/0x2c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.995807648s of 35.219715118s, submitted: 71
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d2000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170401c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170c97a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c54a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1703c4780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1726e8960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 22233088 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.421459198s of 19.482046127s, submitted: 29
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117194752 unmapped: 20414464 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 18587648 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 17473536 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.392076492s of 32.704330444s, submitted: 79
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 18792448 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fde00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fc3c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721fd680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266f680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fa40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266e960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266f0e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266e000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266e780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302588 data_alloc: 234881024 data_used: 13058048
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.978124619s of 10.595539093s, submitted: 169
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fc20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1304100 data_alloc: 234881024 data_used: 13041664
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266f4a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266e1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d173145e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303553 data_alloc: 234881024 data_used: 13041664
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314801 data_alloc: 234881024 data_used: 14536704
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d172659680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.355880737s of 25.392654419s, submitted: 18
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2c000/0x0/0x4ffc00000, data 0x2b76708/0x2c38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172162400 session 0x55d1730ca1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409087 data_alloc: 234881024 data_used: 17907712
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c0c708/0x2cce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1,1])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122634240 unmapped: 22323200 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 22102016 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 22085632 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 22200320 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419099 data_alloc: 234881024 data_used: 17899520
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 22134784 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.809408665s of 10.088632584s, submitted: 133
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122904576 unmapped: 22052864 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122912768 unmapped: 22044672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f88f9000/0x0/0x4ffc00000, data 0x2cb1708/0x2d73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420491 data_alloc: 234881024 data_used: 17899520
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122929152 unmapped: 22028288 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721d2000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c61e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122937344 unmapped: 22020096 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259b680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273109 data_alloc: 234881024 data_used: 13041664
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.456459999s of 13.937705994s, submitted: 67
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1703c6960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d1725841e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272745 data_alloc: 234881024 data_used: 13041664
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1726701e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 46.250705719s of 46.323696136s, submitted: 22
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c4f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170d8dc20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d17263c780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d170d8d2c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172895860
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224861 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d17263d680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 25133056 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d170046780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721fc780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118849536 unmapped: 26107904 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 26222592 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1229754 data_alloc: 234881024 data_used: 12795904
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720954a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.380515099s of 11.418202400s, submitted: 13
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17263c960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266fe00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: mgrc ms_handle_reset ms_handle_reset con 0x55d171202c00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2891176105
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2891176105,v1:192.168.122.100:6801/2891176105]
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: mgrc handle_mgr_configure stats_period=5
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172167c00 session 0x55d1725c45a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172eff4a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.299061775s of 15.430803299s, submitted: 25
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172ee72c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1721fc780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c5e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1728943c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172895860
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1703c70e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17259af00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118349824 unmapped: 36126720 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d17259a000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726701e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955a000/0x0/0x4ffc00000, data 0x20506b6/0x2112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117383168 unmapped: 37093376 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301098 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117170176 unmapped: 37306368 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123084800 unmapped: 31391744 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1703c14a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17293a780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.107300758s of 10.256335258s, submitted: 30
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 36560896 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d16f7d4000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d17266fe00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173145c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173144960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172659680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.658174515s of 30.887229919s, submitted: 38
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1731441e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172095c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d172897680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d170c97a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172894d20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293569 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172894b40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172896960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172095860
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 38543360 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172ee70e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f80683/0x2040000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320800 data_alloc: 234881024 data_used: 15335424
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.557689667s of 17.637289047s, submitted: 14
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 32169984 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124157952 unmapped: 31940608 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f919b000/0x0/0x4ffc00000, data 0x240f6b6/0x24d1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 31776768 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124329984 unmapped: 31768576 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417424 data_alloc: 234881024 data_used: 21848064
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9169000/0x0/0x4ffc00000, data 0x24406b6/0x2502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420008 data_alloc: 234881024 data_used: 22011904
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172036f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172585e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9167000/0x0/0x4ffc00000, data 0x24436b6/0x2505000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.894859314s of 12.680984497s, submitted: 52
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e80b40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.971050262s of 24.083293915s, submitted: 28
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae9000 session 0x55d17311c1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172897a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172095c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726e83c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172efed20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d173144960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d1725845a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 46211072 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d3a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e81e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118374400 unmapped: 45596672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118382592 unmapped: 45588480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123191296 unmapped: 40779776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128458752 unmapped: 35512320 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128548864 unmapped: 35422208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae8c00 session 0x55d170d8d0e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8dc20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.590911865s of 20.736030579s, submitted: 31
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170401e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172673680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172672f00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d1726730e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172c29800 session 0x55d17034e1e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172672d20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d16f9e63c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137723904 unmapped: 26247168 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1635802 data_alloc: 251658240 data_used: 29327360
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e20000/0x0/0x4ffc00000, data 0x3781705/0x3844000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 25903104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d17259b680
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 136216576 unmapped: 27754496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1632727 data_alloc: 251658240 data_used: 29339648
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137707520 unmapped: 26263552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.407573700s of 17.746696472s, submitted: 131
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 140615680 unmapped: 23355392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143704064 unmapped: 20267008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f737e000/0x0/0x4ffc00000, data 0x4223705/0x42e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143548416 unmapped: 20422656 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:40.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1784443 data_alloc: 251658240 data_used: 35086336
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1785315 data_alloc: 251658240 data_used: 35086336
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786531 data_alloc: 251658240 data_used: 35164160
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.249229431s of 14.751939774s, submitted: 97
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 142999552 unmapped: 20971520 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17118a400 session 0x55d16f7d4780
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786215 data_alloc: 251658240 data_used: 35164160
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1704003c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7f93000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1579255 data_alloc: 234881024 data_used: 26243072
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.313569069s of 11.179004669s, submitted: 65
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172ee61e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d17311c000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f81e3000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137388032 unmapped: 26583040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 38322176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275770 data_alloc: 234881024 data_used: 12288000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [0,0,1])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172eff4a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172efe3c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1726725a0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d1726590e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172659a40
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.249382019s of 28.619909286s, submitted: 55
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1371866 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e80960
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c3c0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d172585c20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259af00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172894000
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352162 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172673e00
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.371334076s of 18.521051407s, submitted: 16
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f50000/0x0/0x4ffc00000, data 0x22456a6/0x2306000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1444671 data_alloc: 234881024 data_used: 24260608
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e810e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202000 session 0x55d1726701e0
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 32833536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.647443771s of 15.735019684s, submitted: 39
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280707 data_alloc: 234881024 data_used: 12181504
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e26a6/0x16a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17311cd20
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2332631685' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124674048 unmapped: 39297024 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:19:40 np0005595445 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 05:19:40 np0005595445 nova_compute[226322]: 2026-01-26 10:19:40.320 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:40 np0005595445 nova_compute[226322]: 2026-01-26 10:19:40.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:40.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3949276204' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 05:19:40 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/461349660' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 05:19:41 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 05:19:41 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1644447967' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 05:19:41 np0005595445 nova_compute[226322]: 2026-01-26 10:19:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:41 np0005595445 nova_compute[226322]: 2026-01-26 10:19:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:19:41 np0005595445 nova_compute[226322]: 2026-01-26 10:19:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:19:41 np0005595445 nova_compute[226322]: 2026-01-26 10:19:41.711 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:19:41 np0005595445 nova_compute[226322]: 2026-01-26 10:19:41.711 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:42.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4041690340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1604141855' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 05:19:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:42.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 26 05:19:42 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/237639938' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 05:19:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/358209871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1790643785' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/499056126' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 05:19:43 np0005595445 nova_compute[226322]: 2026-01-26 10:19:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:43 np0005595445 nova_compute[226322]: 2026-01-26 10:19:43.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 26 05:19:43 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3776534711' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 05:19:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:44.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2654797244' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4023938435' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 05:19:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:19:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:44.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4005334717' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 26 05:19:44 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1440409755' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 05:19:44 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 05:19:45 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1985737188' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3473383718' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 05:19:45 np0005595445 nova_compute[226322]: 2026-01-26 10:19:45.334 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:45 np0005595445 nova_compute[226322]: 2026-01-26 10:19:45.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3462755018' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2009624782' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 26 05:19:45 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1215151384' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 05:19:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:46.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 26 05:19:46 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/782210037' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 05:19:46 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 26 05:19:46 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1380287593' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 05:19:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:46.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:47 np0005595445 podman[242246]: 2026-01-26 10:19:47.349317859 +0000 UTC m=+0.122305889 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 05:19:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.714 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:19:47 np0005595445 nova_compute[226322]: 2026-01-26 10:19:47.715 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:19:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/578049670' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 05:19:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:48.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2480028327' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.184 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.370 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.372 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4610MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.373 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.373 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/734800606' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.480 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.480 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:19:48 np0005595445 nova_compute[226322]: 2026-01-26 10:19:48.507 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:19:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:48.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/470784433' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:19:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1554868477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:19:49 np0005595445 nova_compute[226322]: 2026-01-26 10:19:49.008 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:19:49 np0005595445 nova_compute[226322]: 2026-01-26 10:19:49.014 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:19:49 np0005595445 nova_compute[226322]: 2026-01-26 10:19:49.028 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:19:49 np0005595445 nova_compute[226322]: 2026-01-26 10:19:49.029 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:19:49 np0005595445 nova_compute[226322]: 2026-01-26 10:19:49.030 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:19:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 26 05:19:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1549787676' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 05:19:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:50.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 05:19:50 np0005595445 nova_compute[226322]: 2026-01-26 10:19:50.334 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:50 np0005595445 nova_compute[226322]: 2026-01-26 10:19:50.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 05:19:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:50.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/42955068' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 05:19:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 05:19:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 26 05:19:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3491128388' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 05:19:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 26 05:19:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/567662507' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 26 05:19:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:52.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 26 05:19:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1194734970' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 26 05:19:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:52.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 26 05:19:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1873884357' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 26 05:19:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 26 05:19:53 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2117799381' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 26 05:19:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:19:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:19:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:19:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:19:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:54.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 26 05:19:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3853829658' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 26 05:19:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:54.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:55 np0005595445 nova_compute[226322]: 2026-01-26 10:19:55.336 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:55 np0005595445 nova_compute[226322]: 2026-01-26 10:19:55.431 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:19:55 np0005595445 ovs-appctl[243877]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 05:19:55 np0005595445 ovs-appctl[243883]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 05:19:55 np0005595445 ovs-appctl[243900]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 05:19:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 26 05:19:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3198355035' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 26 05:19:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 26 05:19:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1046191194' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 26 05:19:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:19:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:56.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:19:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 26 05:19:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2271613595' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 26 05:19:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:19:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 26 05:19:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3463540812' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 26 05:19:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:19:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:19:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:19:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:19:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:19:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:19:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:19:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:19:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:19:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:19:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:19:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3299309898' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1487676632' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 26 05:19:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2011035917' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 26 05:20:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:00.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 26 05:20:00 np0005595445 ceph-mon[80107]:     osd.2 observed slow operation indications in BlueStore
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Jan 26 05:20:00 np0005595445 ceph-mon[80107]:    daemon nfs.cephfs.1.0.compute-2.najyrz on compute-2 is in error state
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107503081' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 26 05:20:00 np0005595445 nova_compute[226322]: 2026-01-26 10:20:00.337 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:00 np0005595445 nova_compute[226322]: 2026-01-26 10:20:00.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 05:20:00 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1400915670' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 05:20:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 26 05:20:01 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3955527617' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 26 05:20:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Jan 26 05:20:01 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171722842' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 26 05:20:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:02.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 26 05:20:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657391317' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 26 05:20:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 26 05:20:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2158557761' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 26 05:20:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 26 05:20:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1688964418' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 26 05:20:03 np0005595445 podman[245646]: 2026-01-26 10:20:03.911657211 +0000 UTC m=+0.091191574 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 05:20:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:04.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 26 05:20:04 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1535477781' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 26 05:20:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:04.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:05 np0005595445 nova_compute[226322]: 2026-01-26 10:20:05.339 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 26 05:20:05 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2906224273' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 26 05:20:05 np0005595445 nova_compute[226322]: 2026-01-26 10:20:05.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:05 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 26 05:20:05 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2459420877' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 26 05:20:06 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 05:20:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:06.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:06.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:06 np0005595445 systemd[1]: Starting Time & Date Service...
Jan 26 05:20:06 np0005595445 systemd[1]: Started Time & Date Service.
Jan 26 05:20:06 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 05:20:06 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/634901022' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 05:20:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 26 05:20:07 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3521053892' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 26 05:20:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:08.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:08.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:08 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 26 05:20:08 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3007392388' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 26 05:20:09 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 26 05:20:09 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4192985596' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 26 05:20:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:10.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:10 np0005595445 nova_compute[226322]: 2026-01-26 10:20:10.340 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:10 np0005595445 nova_compute[226322]: 2026-01-26 10:20:10.436 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:10.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:10 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:10 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:20:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:11 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:20:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:12.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:14.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:14.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:15 np0005595445 nova_compute[226322]: 2026-01-26 10:20:15.392 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:15 np0005595445 nova_compute[226322]: 2026-01-26 10:20:15.437 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:16.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:17 np0005595445 podman[246554]: 2026-01-26 10:20:17.969346557 +0000 UTC m=+0.114674961 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 26 05:20:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:18.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:18.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:20:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:20.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:20 np0005595445 nova_compute[226322]: 2026-01-26 10:20:20.393 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:20 np0005595445 nova_compute[226322]: 2026-01-26 10:20:20.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:20.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:20:21 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s#012Cumulative WAL: 10K writes, 2923 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2426 writes, 8573 keys, 2426 commit groups, 1.0 writes per commit group, ingest: 9.03 MB, 0.02 MB/s#012Interval WAL: 2426 writes, 1011 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 05:20:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:22.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:22.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:24.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:24.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:25 np0005595445 nova_compute[226322]: 2026-01-26 10:20:25.434 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:25 np0005595445 nova_compute[226322]: 2026-01-26 10:20:25.439 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:26.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:26.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:28.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:28.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:30 np0005595445 nova_compute[226322]: 2026-01-26 10:20:30.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:30 np0005595445 nova_compute[226322]: 2026-01-26 10:20:30.440 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:30.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:32.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:32.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:20:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:34.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:20:34 np0005595445 podman[246620]: 2026-01-26 10:20:34.291662875 +0000 UTC m=+0.069017402 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:20:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:34.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:35 np0005595445 nova_compute[226322]: 2026-01-26 10:20:35.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:35 np0005595445 nova_compute[226322]: 2026-01-26 10:20:35.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:36.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:36.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:36 np0005595445 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 05:20:36 np0005595445 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 05:20:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:38 np0005595445 nova_compute[226322]: 2026-01-26 10:20:38.030 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:38 np0005595445 nova_compute[226322]: 2026-01-26 10:20:38.031 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:38.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:38.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:38 np0005595445 nova_compute[226322]: 2026-01-26 10:20:38.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:38 np0005595445 nova_compute[226322]: 2026-01-26 10:20:38.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:38 np0005595445 nova_compute[226322]: 2026-01-26 10:20:38.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:20:39 np0005595445 nova_compute[226322]: 2026-01-26 10:20:39.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:40.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:40 np0005595445 nova_compute[226322]: 2026-01-26 10:20:40.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:40 np0005595445 nova_compute[226322]: 2026-01-26 10:20:40.442 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:41 np0005595445 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:41 np0005595445 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:20:41 np0005595445 nova_compute[226322]: 2026-01-26 10:20:41.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:20:41 np0005595445 nova_compute[226322]: 2026-01-26 10:20:41.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:20:41 np0005595445 nova_compute[226322]: 2026-01-26 10:20:41.706 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:42.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:42.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:44.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:44.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:45 np0005595445 nova_compute[226322]: 2026-01-26 10:20:45.440 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:45 np0005595445 nova_compute[226322]: 2026-01-26 10:20:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:46.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.935 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.936 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:20:47 np0005595445 nova_compute[226322]: 2026-01-26 10:20:47.937 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:20:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:48.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:48 np0005595445 podman[246696]: 2026-01-26 10:20:48.30360136 +0000 UTC m=+0.084273882 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 26 05:20:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:20:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/355653575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.456 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.620 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.621 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.622 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.622 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:20:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:48.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.685 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.686 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:20:48 np0005595445 nova_compute[226322]: 2026-01-26 10:20:48.714 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:20:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:20:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1812899865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:20:49 np0005595445 nova_compute[226322]: 2026-01-26 10:20:49.230 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:20:49 np0005595445 nova_compute[226322]: 2026-01-26 10:20:49.237 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:20:49 np0005595445 nova_compute[226322]: 2026-01-26 10:20:49.274 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:20:49 np0005595445 nova_compute[226322]: 2026-01-26 10:20:49.275 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:20:49 np0005595445 nova_compute[226322]: 2026-01-26 10:20:49.275 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:20:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:50.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:50 np0005595445 nova_compute[226322]: 2026-01-26 10:20:50.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:50 np0005595445 nova_compute[226322]: 2026-01-26 10:20:50.443 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:20:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:52.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:52.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.943 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:20:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:20:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:20:53.944 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:20:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:54.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:20:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:54.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:20:55 np0005595445 nova_compute[226322]: 2026-01-26 10:20:55.444 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:20:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:56.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:56.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:20:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:20:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:20:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:20:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:20:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:20:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:20:58.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:20:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:20:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:20:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3073080115' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:20:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:20:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:20:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:20:58.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:20:58 np0005595445 systemd[1]: session-55.scope: Deactivated successfully.
Jan 26 05:20:58 np0005595445 systemd[1]: session-55.scope: Consumed 2min 50.963s CPU time, 640.8M memory peak, read 202.2M from disk, written 65.0M to disk.
Jan 26 05:20:58 np0005595445 systemd-logind[783]: Session 55 logged out. Waiting for processes to exit.
Jan 26 05:20:58 np0005595445 systemd-logind[783]: Removed session 55.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: New session 56 of user zuul.
Jan 26 05:20:59 np0005595445 systemd[1]: Started Session 56 of User zuul.
Jan 26 05:20:59 np0005595445 systemd[1]: session-56.scope: Deactivated successfully.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: Session 56 logged out. Waiting for processes to exit.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: Removed session 56.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: New session 57 of user zuul.
Jan 26 05:20:59 np0005595445 systemd[1]: Started Session 57 of User zuul.
Jan 26 05:20:59 np0005595445 systemd[1]: session-57.scope: Deactivated successfully.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: Session 57 logged out. Waiting for processes to exit.
Jan 26 05:20:59 np0005595445 systemd-logind[783]: Removed session 57.
Jan 26 05:21:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:00.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.446 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.448 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.448 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.449 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.450 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:21:00 np0005595445 nova_compute[226322]: 2026-01-26 10:21:00.453 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:00.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:02.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:02.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:04.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:04.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:05 np0005595445 podman[246816]: 2026-01-26 10:21:05.335178458 +0000 UTC m=+0.101505278 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 05:21:05 np0005595445 nova_compute[226322]: 2026-01-26 10:21:05.450 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:21:05 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6966 writes, 36K keys, 6966 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6966 writes, 6966 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1523 writes, 7934 keys, 1523 commit groups, 1.0 writes per commit group, ingest: 17.68 MB, 0.03 MB/s#012Interval WAL: 1523 writes, 1523 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     87.3      0.61              0.17        19    0.032       0      0       0.0       0.0#012  L6      1/0   14.26 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.4    144.2    123.5      1.90              0.58        18    0.106    100K    10K       0.0       0.0#012 Sum      1/0   14.26 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.4    109.1    114.7      2.52              0.75        37    0.068    100K    10K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.8    109.2    109.8      0.60              0.20         8    0.075     26K   2580       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    144.2    123.5      1.90              0.58        18    0.106    100K    10K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     87.7      0.61              0.17        18    0.034       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.052, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.28 GB write, 0.12 MB/s write, 0.27 GB read, 0.11 MB/s read, 2.5 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55af2cb209b0#2 capacity: 304.00 MB usage: 24.02 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000242 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1466,23.24 MB,7.64475%) FilterBlock(37,289.36 KB,0.0929531%) IndexBlock(37,511.45 KB,0.164298%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 26 05:21:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:06.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:06.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:07.988815) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422867988846, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2079, "num_deletes": 506, "total_data_size": 3966421, "memory_usage": 4016848, "flush_reason": "Manual Compaction"}
Jan 26 05:21:07 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 26 05:21:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868010612, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2581045, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34788, "largest_seqno": 36862, "table_properties": {"data_size": 2571773, "index_size": 4934, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 27614, "raw_average_key_size": 21, "raw_value_size": 2549462, "raw_average_value_size": 1967, "num_data_blocks": 212, "num_entries": 1296, "num_filter_entries": 1296, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422755, "oldest_key_time": 1769422755, "file_creation_time": 1769422867, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 21961 microseconds, and 8091 cpu microseconds.
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.010765) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2581045 bytes OK
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.010795) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.012999) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.013030) EVENT_LOG_v1 {"time_micros": 1769422868013021, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.013058) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 3954998, prev total WAL file size 3954998, number of live WAL files 2.
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.014740) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2520KB)], [66(14MB)]
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868014798, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17531538, "oldest_snapshot_seqno": -1}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6549 keys, 15261766 bytes, temperature: kUnknown
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868156552, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 15261766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15216930, "index_size": 27380, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 172139, "raw_average_key_size": 26, "raw_value_size": 15097796, "raw_average_value_size": 2305, "num_data_blocks": 1082, "num_entries": 6549, "num_filter_entries": 6549, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422868, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.156945) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 15261766 bytes
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.158275) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.5 rd, 107.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 14.3 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(12.7) write-amplify(5.9) OK, records in: 7576, records dropped: 1027 output_compression: NoCompression
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.158303) EVENT_LOG_v1 {"time_micros": 1769422868158291, "job": 40, "event": "compaction_finished", "compaction_time_micros": 141923, "compaction_time_cpu_micros": 53028, "output_level": 6, "num_output_files": 1, "total_output_size": 15261766, "num_input_records": 7576, "num_output_records": 6549, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868159309, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422868164331, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.014638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164392) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164394) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:08.164397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:08.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:10.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:10 np0005595445 nova_compute[226322]: 2026-01-26 10:21:10.453 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:10.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:12.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:12.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:14.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:14.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.455 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.457 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.507 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:15 np0005595445 nova_compute[226322]: 2026-01-26 10:21:15.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:21:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:16.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:16.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:18.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:18 np0005595445 podman[246934]: 2026-01-26 10:21:18.590719817 +0000 UTC m=+0.093868606 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 05:21:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:18.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:18 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:19 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:21:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:20.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:20 np0005595445 nova_compute[226322]: 2026-01-26 10:21:20.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:21:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:20.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:20 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:20 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:20 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:21:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:22.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:24.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:25 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:21:25 np0005595445 nova_compute[226322]: 2026-01-26 10:21:25.508 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:25 np0005595445 nova_compute[226322]: 2026-01-26 10:21:25.510 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:26.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:28.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:30.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:30 np0005595445 nova_compute[226322]: 2026-01-26 10:21:30.509 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:30 np0005595445 nova_compute[226322]: 2026-01-26 10:21:30.511 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:30 np0005595445 nova_compute[226322]: 2026-01-26 10:21:30.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:30 np0005595445 nova_compute[226322]: 2026-01-26 10:21:30.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 05:21:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:32.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:32.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 05:21:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:34.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 05:21:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:34.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:35 np0005595445 nova_compute[226322]: 2026-01-26 10:21:35.510 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:35 np0005595445 nova_compute[226322]: 2026-01-26 10:21:35.512 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:36.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:36 np0005595445 podman[247108]: 2026-01-26 10:21:36.295427006 +0000 UTC m=+0.065923956 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 05:21:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:38.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:38 np0005595445 nova_compute[226322]: 2026-01-26 10:21:38.704 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:38 np0005595445 nova_compute[226322]: 2026-01-26 10:21:38.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:39 np0005595445 nova_compute[226322]: 2026-01-26 10:21:39.680 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:39 np0005595445 nova_compute[226322]: 2026-01-26 10:21:39.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:39 np0005595445 nova_compute[226322]: 2026-01-26 10:21:39.685 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.920914) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899920948, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 591, "num_deletes": 250, "total_data_size": 1124047, "memory_usage": 1149016, "flush_reason": "Manual Compaction"}
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899927697, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 590931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36867, "largest_seqno": 37453, "table_properties": {"data_size": 587990, "index_size": 913, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7809, "raw_average_key_size": 21, "raw_value_size": 581968, "raw_average_value_size": 1572, "num_data_blocks": 37, "num_entries": 370, "num_filter_entries": 370, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422869, "oldest_key_time": 1769422869, "file_creation_time": 1769422899, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 6885 microseconds, and 2567 cpu microseconds.
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.927794) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 590931 bytes OK
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.927816) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930791) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930805) EVENT_LOG_v1 {"time_micros": 1769422899930801, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.930822) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1120680, prev total WAL file size 1120680, number of live WAL files 2.
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.931505) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323531' seq:0, type:0; will stop at (end)
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(577KB)], [69(14MB)]
Jan 26 05:21:39 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422899931553, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15852697, "oldest_snapshot_seqno": -1}
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6411 keys, 11851886 bytes, temperature: kUnknown
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900035481, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11851886, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11812376, "index_size": 22376, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16069, "raw_key_size": 169458, "raw_average_key_size": 26, "raw_value_size": 11700025, "raw_average_value_size": 1824, "num_data_blocks": 874, "num_entries": 6411, "num_filter_entries": 6411, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769422899, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.036227) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11851886 bytes
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.039425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.9 rd, 113.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 14.6 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(46.9) write-amplify(20.1) OK, records in: 6919, records dropped: 508 output_compression: NoCompression
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.039469) EVENT_LOG_v1 {"time_micros": 1769422900039453, "job": 42, "event": "compaction_finished", "compaction_time_micros": 104367, "compaction_time_cpu_micros": 32261, "output_level": 6, "num_output_files": 1, "total_output_size": 11851886, "num_input_records": 6919, "num_output_records": 6411, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900039810, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769422900043767, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:39.931418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:21:40.043960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:21:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:40.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:40 np0005595445 nova_compute[226322]: 2026-01-26 10:21:40.513 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:40.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.716 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.717 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 05:21:41 np0005595445 nova_compute[226322]: 2026-01-26 10:21:41.747 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 05:21:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:42 np0005595445 nova_compute[226322]: 2026-01-26 10:21:42.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:43 np0005595445 nova_compute[226322]: 2026-01-26 10:21:43.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:44.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:45 np0005595445 nova_compute[226322]: 2026-01-26 10:21:45.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:46.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:46 np0005595445 nova_compute[226322]: 2026-01-26 10:21:46.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:46 np0005595445 nova_compute[226322]: 2026-01-26 10:21:46.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:46.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.704 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.733 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:21:47 np0005595445 nova_compute[226322]: 2026-01-26 10:21:47.734 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:21:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:21:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1170456353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.174 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:21:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:48.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.335 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.336 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4831MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.336 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.337 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.581 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.582 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:21:48 np0005595445 nova_compute[226322]: 2026-01-26 10:21:48.644 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:21:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:48.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:21:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/737420374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:21:49 np0005595445 nova_compute[226322]: 2026-01-26 10:21:49.069 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:21:49 np0005595445 nova_compute[226322]: 2026-01-26 10:21:49.076 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:21:49 np0005595445 nova_compute[226322]: 2026-01-26 10:21:49.120 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:21:49 np0005595445 nova_compute[226322]: 2026-01-26 10:21:49.121 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:21:49 np0005595445 nova_compute[226322]: 2026-01-26 10:21:49.122 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:21:49 np0005595445 podman[247206]: 2026-01-26 10:21:49.341645109 +0000 UTC m=+0.120651966 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 05:21:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:21:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:50.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:21:50 np0005595445 nova_compute[226322]: 2026-01-26 10:21:50.515 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:50.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:52.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:52.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.945 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:21:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.946 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:21:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:21:53.946 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:21:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:54.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:54.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:55 np0005595445 nova_compute[226322]: 2026-01-26 10:21:55.519 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:21:56 np0005595445 nova_compute[226322]: 2026-01-26 10:21:56.064 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:21:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:21:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:56.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:21:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:21:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:21:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:21:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:21:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:21:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:21:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:21:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:21:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:21:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3956459805' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:21:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:21:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:21:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:21:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:21:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:21:58.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:00.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:00 np0005595445 nova_compute[226322]: 2026-01-26 10:22:00.518 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:00 np0005595445 nova_compute[226322]: 2026-01-26 10:22:00.521 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:00.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:02.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:04.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:04.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:05 np0005595445 nova_compute[226322]: 2026-01-26 10:22:05.520 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:06.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:06.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:06 np0005595445 podman[247268]: 2026-01-26 10:22:06.861692597 +0000 UTC m=+0.082441463 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:22:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:08.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:10.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:10 np0005595445 nova_compute[226322]: 2026-01-26 10:22:10.522 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:12.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:14.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:14.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:15 np0005595445 nova_compute[226322]: 2026-01-26 10:22:15.524 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:22:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:16.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:18.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:18.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 26 05:22:19 np0005595445 radosgw[82065]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Jan 26 05:22:20 np0005595445 podman[247296]: 2026-01-26 10:22:20.321747001 +0000 UTC m=+0.094677174 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 05:22:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:20.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:20 np0005595445 nova_compute[226322]: 2026-01-26 10:22:20.525 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:20.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:22.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:22.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:24.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:25 np0005595445 nova_compute[226322]: 2026-01-26 10:22:25.527 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:25 np0005595445 nova_compute[226322]: 2026-01-26 10:22:25.529 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:26.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:22:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:22:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:22:26 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:22:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:26.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:28.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:28.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:30.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:30 np0005595445 nova_compute[226322]: 2026-01-26 10:22:30.529 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:30.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:32.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:32.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:22:32 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:22:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:34.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:35 np0005595445 nova_compute[226322]: 2026-01-26 10:22:35.530 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:36.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:36.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:37 np0005595445 podman[247465]: 2026-01-26 10:22:37.277962793 +0000 UTC m=+0.056687966 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 05:22:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:38.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:38.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:40.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:40 np0005595445 nova_compute[226322]: 2026-01-26 10:22:40.532 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:40 np0005595445 nova_compute[226322]: 2026-01-26 10:22:40.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:40 np0005595445 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:40 np0005595445 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:40 np0005595445 nova_compute[226322]: 2026-01-26 10:22:40.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:22:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:40.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:41 np0005595445 nova_compute[226322]: 2026-01-26 10:22:41.682 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:42.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:42 np0005595445 nova_compute[226322]: 2026-01-26 10:22:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:42 np0005595445 nova_compute[226322]: 2026-01-26 10:22:42.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:22:42 np0005595445 nova_compute[226322]: 2026-01-26 10:22:42.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:22:42 np0005595445 nova_compute[226322]: 2026-01-26 10:22:42.704 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:22:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:22:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:22:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:43 np0005595445 nova_compute[226322]: 2026-01-26 10:22:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:43 np0005595445 nova_compute[226322]: 2026-01-26 10:22:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:44.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.533 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.535 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.536 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:22:45 np0005595445 nova_compute[226322]: 2026-01-26 10:22:45.537 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:22:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:46.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:22:46 np0005595445 nova_compute[226322]: 2026-01-26 10:22:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:46.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:48.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:48.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.743 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.744 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.744 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.745 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:22:49 np0005595445 nova_compute[226322]: 2026-01-26 10:22:49.745 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:22:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:22:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1151407283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.227 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:22:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:50.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.430 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.432 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.432 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.433 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.516 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.516 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.536 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.587 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.690 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.691 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.713 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.733 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:22:50 np0005595445 nova_compute[226322]: 2026-01-26 10:22:50.761 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:22:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:50.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:22:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3104688942' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:22:51 np0005595445 nova_compute[226322]: 2026-01-26 10:22:51.234 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:22:51 np0005595445 nova_compute[226322]: 2026-01-26 10:22:51.240 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:22:51 np0005595445 nova_compute[226322]: 2026-01-26 10:22:51.261 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:22:51 np0005595445 nova_compute[226322]: 2026-01-26 10:22:51.263 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:22:51 np0005595445 nova_compute[226322]: 2026-01-26 10:22:51.263 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:22:51 np0005595445 podman[247558]: 2026-01-26 10:22:51.344841397 +0000 UTC m=+0.114665108 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 26 05:22:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:52.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:52.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.947 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:22:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.947 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:22:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:22:53.948 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:22:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:54.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:54.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:55 np0005595445 nova_compute[226322]: 2026-01-26 10:22:55.539 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:22:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:56.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:56.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:22:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:22:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:22:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:22:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:22:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:22:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:22:58.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:22:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:22:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:22:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:22:58.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:00.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:00 np0005595445 nova_compute[226322]: 2026-01-26 10:23:00.540 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:00.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:02.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:02.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:04.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:04.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:05 np0005595445 nova_compute[226322]: 2026-01-26 10:23:05.542 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:06.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:07 np0005595445 podman[247622]: 2026-01-26 10:23:07.878095378 +0000 UTC m=+0.054256999 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 05:23:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:08.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:08.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:10.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:10 np0005595445 nova_compute[226322]: 2026-01-26 10:23:10.544 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:10.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:12.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:12.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:14.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:14.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:15 np0005595445 nova_compute[226322]: 2026-01-26 10:23:15.546 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:16.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:18.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:20.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:20 np0005595445 nova_compute[226322]: 2026-01-26 10:23:20.548 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:22 np0005595445 podman[247649]: 2026-01-26 10:23:22.386942872 +0000 UTC m=+0.156996386 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:23:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:22.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:22.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:24.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:24.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.550 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.552 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.553 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:25 np0005595445 nova_compute[226322]: 2026-01-26 10:23:25.565 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:23:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:26.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:26.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:28.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:28.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:30.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:30 np0005595445 nova_compute[226322]: 2026-01-26 10:23:30.566 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:30.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:32.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:32.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:23:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:23:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:23:33 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:23:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:34.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:34.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.567 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.568 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.570 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:23:35 np0005595445 nova_compute[226322]: 2026-01-26 10:23:35.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:36.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:36.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:38 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:23:38 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:23:38 np0005595445 podman[247814]: 2026-01-26 10:23:38.212581808 +0000 UTC m=+0.057997054 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:23:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:38.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:38.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:40.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:40 np0005595445 nova_compute[226322]: 2026-01-26 10:23:40.571 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:40.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:42 np0005595445 nova_compute[226322]: 2026-01-26 10:23:42.264 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:42 np0005595445 nova_compute[226322]: 2026-01-26 10:23:42.265 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:42.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:42 np0005595445 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:42 np0005595445 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:42 np0005595445 nova_compute[226322]: 2026-01-26 10:23:42.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:23:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:42.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:43 np0005595445 nova_compute[226322]: 2026-01-26 10:23:43.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:43 np0005595445 nova_compute[226322]: 2026-01-26 10:23:43.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:44.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:44 np0005595445 nova_compute[226322]: 2026-01-26 10:23:44.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:44 np0005595445 nova_compute[226322]: 2026-01-26 10:23:44.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:23:44 np0005595445 nova_compute[226322]: 2026-01-26 10:23:44.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:23:44 np0005595445 nova_compute[226322]: 2026-01-26 10:23:44.725 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:23:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:44.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:45 np0005595445 nova_compute[226322]: 2026-01-26 10:23:45.573 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:23:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:46.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:46.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:47 np0005595445 nova_compute[226322]: 2026-01-26 10:23:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:47 np0005595445 nova_compute[226322]: 2026-01-26 10:23:47.705 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:48.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:48.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:23:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:50.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:23:50 np0005595445 nova_compute[226322]: 2026-01-26 10:23:50.574 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:50.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.722 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.723 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.723 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.724 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:23:51 np0005595445 nova_compute[226322]: 2026-01-26 10:23:51.724 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:23:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:23:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/909490233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.165 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.322 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.324 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.324 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.325 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.421 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.422 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.442 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:23:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:52.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:23:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4236003645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.886 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.891 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.904 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.905 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:23:52 np0005595445 nova_compute[226322]: 2026-01-26 10:23:52.906 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:23:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:52.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:53 np0005595445 podman[247914]: 2026-01-26 10:23:53.388124261 +0000 UTC m=+0.148692755 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 05:23:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.949 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:23:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.949 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:23:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:23:53.950 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:23:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:54.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:54.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:55 np0005595445 nova_compute[226322]: 2026-01-26 10:23:55.576 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:23:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:56.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:56.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:23:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:23:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:23:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:23:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:23:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:23:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:23:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:23:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:23:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:23:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2050018793' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:23:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:23:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:23:58.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:23:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:23:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:23:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:23:58.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:00.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:00 np0005595445 nova_compute[226322]: 2026-01-26 10:24:00.577 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:00 np0005595445 nova_compute[226322]: 2026-01-26 10:24:00.580 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:00.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:02.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:02.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:04.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:04.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:05 np0005595445 nova_compute[226322]: 2026-01-26 10:24:05.579 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:06.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:06.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:08.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:08.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:09 np0005595445 podman[247974]: 2026-01-26 10:24:09.303692413 +0000 UTC m=+0.081111488 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 26 05:24:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:10.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.583 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.585 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:10 np0005595445 nova_compute[226322]: 2026-01-26 10:24:10.629 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:10.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:12.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:12.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:14.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:15 np0005595445 nova_compute[226322]: 2026-01-26 10:24:15.631 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:16.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:18.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:20 np0005595445 nova_compute[226322]: 2026-01-26 10:24:20.632 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:20 np0005595445 nova_compute[226322]: 2026-01-26 10:24:20.634 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:20.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:22.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:22.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:24 np0005595445 podman[248001]: 2026-01-26 10:24:24.337627973 +0000 UTC m=+0.114420272 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 26 05:24:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:24.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:24.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.635 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.637 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:25 np0005595445 nova_compute[226322]: 2026-01-26 10:24:25.677 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:26.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:26.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:28.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:28.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:30.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:30 np0005595445 nova_compute[226322]: 2026-01-26 10:24:30.678 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:30.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:32.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:32.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:34.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:34.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:35 np0005595445 nova_compute[226322]: 2026-01-26 10:24:35.680 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:36.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:36.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:38.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:24:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:38.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:24:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:24:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:24:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:24:39 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:24:40 np0005595445 podman[248143]: 2026-01-26 10:24:40.276515946 +0000 UTC m=+0.051918576 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 26 05:24:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:40.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.685 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.686 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.687 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:24:40 np0005595445 nova_compute[226322]: 2026-01-26 10:24:40.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:40.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.197851) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081197894, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 2015, "num_deletes": 251, "total_data_size": 5255356, "memory_usage": 5321232, "flush_reason": "Manual Compaction"}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081225054, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3423062, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37458, "largest_seqno": 39468, "table_properties": {"data_size": 3414852, "index_size": 5024, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16999, "raw_average_key_size": 20, "raw_value_size": 3398409, "raw_average_value_size": 4036, "num_data_blocks": 218, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769422900, "oldest_key_time": 1769422900, "file_creation_time": 1769423081, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 27368 microseconds, and 9720 cpu microseconds.
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.225190) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3423062 bytes OK
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.225284) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227672) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227705) EVENT_LOG_v1 {"time_micros": 1769423081227689, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.227878) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 5246388, prev total WAL file size 5246388, number of live WAL files 2.
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.229456) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3342KB)], [72(11MB)]
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081229506, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15274948, "oldest_snapshot_seqno": -1}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6737 keys, 13078069 bytes, temperature: kUnknown
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081315383, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13078069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13035403, "index_size": 24698, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 176945, "raw_average_key_size": 26, "raw_value_size": 12916299, "raw_average_value_size": 1917, "num_data_blocks": 970, "num_entries": 6737, "num_filter_entries": 6737, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423081, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.315600) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13078069 bytes
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.317462) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.7 rd, 152.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 11.3 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.3) write-amplify(3.8) OK, records in: 7253, records dropped: 516 output_compression: NoCompression
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.317479) EVENT_LOG_v1 {"time_micros": 1769423081317471, "job": 44, "event": "compaction_finished", "compaction_time_micros": 85938, "compaction_time_cpu_micros": 31860, "output_level": 6, "num_output_files": 1, "total_output_size": 13078069, "num_input_records": 7253, "num_output_records": 6737, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081318046, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423081320210, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.229371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:41 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:24:41.320285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:24:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:42.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:42 np0005595445 nova_compute[226322]: 2026-01-26 10:24:42.907 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:42 np0005595445 nova_compute[226322]: 2026-01-26 10:24:42.908 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:42.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:43 np0005595445 nova_compute[226322]: 2026-01-26 10:24:43.684 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:43 np0005595445 nova_compute[226322]: 2026-01-26 10:24:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:43 np0005595445 nova_compute[226322]: 2026-01-26 10:24:43.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:43 np0005595445 nova_compute[226322]: 2026-01-26 10:24:43.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:24:44 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:24:44 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:24:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:44.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:44 np0005595445 nova_compute[226322]: 2026-01-26 10:24:44.688 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:44 np0005595445 nova_compute[226322]: 2026-01-26 10:24:44.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:24:44 np0005595445 nova_compute[226322]: 2026-01-26 10:24:44.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:24:44 np0005595445 nova_compute[226322]: 2026-01-26 10:24:44.716 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:24:44 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:44 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:44 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:45 np0005595445 nova_compute[226322]: 2026-01-26 10:24:45.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:45 np0005595445 nova_compute[226322]: 2026-01-26 10:24:45.688 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:24:45 np0005595445 nova_compute[226322]: 2026-01-26 10:24:45.792 226326 DEBUG oslo_concurrency.processutils [None req-695a35b5-cbaf-43c3-a71b-bf8c928be5ef c2f0bcfebfa24487b4079cc85d8950ce 3ff3fa2a5531460b993c609589aa545d - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:24:45 np0005595445 nova_compute[226322]: 2026-01-26 10:24:45.822 226326 DEBUG oslo_concurrency.processutils [None req-695a35b5-cbaf-43c3-a71b-bf8c928be5ef c2f0bcfebfa24487b4079cc85d8950ce 3ff3fa2a5531460b993c609589aa545d - - default default] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:24:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:46.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:46 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:46 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:46 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:46.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:48.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:48 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:48 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:48 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:48.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:49 np0005595445 nova_compute[226322]: 2026-01-26 10:24:49.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:50 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:50 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:50 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:50.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:50 np0005595445 nova_compute[226322]: 2026-01-26 10:24:50.691 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:52 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:52 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:52 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:52.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:52 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:52.557 143326 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '02:1d:e1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7e:2d:b7:9f:32:de'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.558 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:52 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:52.559 143326 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.738 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.739 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.740 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:24:52 np0005595445 nova_compute[226322]: 2026-01-26 10:24:52.740 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:24:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:24:53 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3808329144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.239 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.431 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.432 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4848MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.432 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.433 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.493 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.494 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.512 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:24:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.950 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:24:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.951 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:24:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:53.952 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:24:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:24:53 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/11944970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.979 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:24:53 np0005595445 nova_compute[226322]: 2026-01-26 10:24:53.985 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:24:54 np0005595445 nova_compute[226322]: 2026-01-26 10:24:54.003 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:24:54 np0005595445 nova_compute[226322]: 2026-01-26 10:24:54.006 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:24:54 np0005595445 nova_compute[226322]: 2026-01-26 10:24:54.007 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:24:54 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:54 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:54 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:24:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:24:55 np0005595445 podman[248263]: 2026-01-26 10:24:55.376420219 +0000 UTC m=+0.142427271 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:24:55 np0005595445 nova_compute[226322]: 2026-01-26 10:24:55.692 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:55 np0005595445 nova_compute[226322]: 2026-01-26 10:24:55.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:24:56 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:56 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:24:56 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:56.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:24:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:57.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:24:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:24:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:24:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:24:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:24:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:24:58 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:58 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:58 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:24:58.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:24:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:24:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:24:59.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:24:59 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:24:59.561 143326 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=5f259fb6-5896-4c89-8853-1dd537a2ebf7, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 26 05:25:00 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:00 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:00 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:00.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:00 np0005595445 nova_compute[226322]: 2026-01-26 10:25:00.694 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:01.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:02 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:02 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:25:02 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:02.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:25:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:03.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:04 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:04 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:04 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:04.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:25:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:05.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:25:05 np0005595445 nova_compute[226322]: 2026-01-26 10:25:05.695 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:06 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:06 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:06 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:06.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:07.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:08 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:08 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:08 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:08.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:25:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:09.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:25:10 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:10 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:10 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:10.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:10 np0005595445 nova_compute[226322]: 2026-01-26 10:25:10.698 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:11.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:11 np0005595445 podman[248322]: 2026-01-26 10:25:11.295561501 +0000 UTC m=+0.074773034 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 05:25:12 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:12 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:12 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:12.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:13.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:14 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:14 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:14 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:14.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:15.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.700 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.702 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:15 np0005595445 nova_compute[226322]: 2026-01-26 10:25:15.731 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:16 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:16 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:16 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:16.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:17.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:18 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:18 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:18 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:18.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:19.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:20 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:20 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:20 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:20.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:20 np0005595445 nova_compute[226322]: 2026-01-26 10:25:20.732 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:21.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:22 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:22 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:22 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:22.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:23.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:24 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:24 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:24 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:24.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:25.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:25 np0005595445 nova_compute[226322]: 2026-01-26 10:25:25.733 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:26 np0005595445 podman[248349]: 2026-01-26 10:25:26.290860779 +0000 UTC m=+0.071707710 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 05:25:26 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:26 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:26 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:26.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:27.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:28 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:28 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:28 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:28.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:29.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:30 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:30 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:30 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:30.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.735 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.776 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.778 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:30 np0005595445 nova_compute[226322]: 2026-01-26 10:25:30.779 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:31.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:32 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:32 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000028s ======
Jan 26 05:25:32 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:32.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 26 05:25:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:33.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:34 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:34 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:34 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:34.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:35.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:35 np0005595445 nova_compute[226322]: 2026-01-26 10:25:35.780 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:36 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:36 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:36 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:36.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:37.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:38 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:38 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:38 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:38.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:40 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:40 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:40 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:40.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:40 np0005595445 nova_compute[226322]: 2026-01-26 10:25:40.782 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:41.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:42 np0005595445 podman[248409]: 2026-01-26 10:25:42.281935017 +0000 UTC m=+0.058712603 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 26 05:25:42 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:42 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:42 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:42.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:44 np0005595445 nova_compute[226322]: 2026-01-26 10:25:44.007 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:44 np0005595445 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:44 np0005595445 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:44 np0005595445 nova_compute[226322]: 2026-01-26 10:25:44.008 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:44 np0005595445 nova_compute[226322]: 2026-01-26 10:25:44.009 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:25:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:45.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:25:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:45 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:45.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.328 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.330 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.330 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:25:45 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.356 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:25:45 np0005595445 podman[248553]: 2026-01-26 10:25:45.368067396 +0000 UTC m=+0.839612593 container exec 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Jan 26 05:25:45 np0005595445 podman[248553]: 2026-01-26 10:25:45.468615556 +0000 UTC m=+0.940160823 container exec_died 92f0d6d8766ae48472ddf8a153c9b70289899caad4fc617f02a5f998f1b14f40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-crash-compute-1, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.784 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.785 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:45 np0005595445 nova_compute[226322]: 2026-01-26 10:25:45.786 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:46 np0005595445 podman[248691]: 2026-01-26 10:25:46.011788479 +0000 UTC m=+0.058425095 container exec 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:25:46 np0005595445 podman[248691]: 2026-01-26 10:25:46.019876761 +0000 UTC m=+0.066513377 container exec_died 195b20288843353b3582fbff70643eef0adbd6529b6a14aa2b2d24a2fd7f731a (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 26 05:25:46 np0005595445 podman[248762]: 2026-01-26 10:25:46.295457317 +0000 UTC m=+0.066862967 container exec 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 26 05:25:46 np0005595445 podman[248762]: 2026-01-26 10:25:46.334293333 +0000 UTC m=+0.105698943 container exec_died 2481bf60245d7c2e187aaa8c0a64c69d854b50d0db3801811d86e7df60b7c5b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 26 05:25:46 np0005595445 podman[248828]: 2026-01-26 10:25:46.597119219 +0000 UTC m=+0.073168221 container exec 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:25:46 np0005595445 podman[248828]: 2026-01-26 10:25:46.613150338 +0000 UTC m=+0.089199330 container exec_died 0b47113c4bec12a1b959ec116893ec06b51bfdee5d92d974ebc910ad2fb55e29 (image=quay.io/ceph/haproxy:2.3, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-haproxy-nfs-cephfs-compute-1-nsxfyf)
Jan 26 05:25:46 np0005595445 podman[248896]: 2026-01-26 10:25:46.858303259 +0000 UTC m=+0.068703677 container exec 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1793, com.redhat.component=keepalived-container, version=2.2.4, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 26 05:25:46 np0005595445 podman[248896]: 2026-01-26 10:25:46.879185552 +0000 UTC m=+0.089585940 container exec_died 208e9665eacfd07bd22323ccaa6b7ee11548adb807d3aca7b7ff500d25179268 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-keepalived-nfs-cephfs-compute-1-wvnxoh, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, architecture=x86_64, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-type=git, name=keepalived, release=1793)
Jan 26 05:25:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:47.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 26 05:25:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1077740556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:48 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:25:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:49.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:49.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:49 np0005595445 nova_compute[226322]: 2026-01-26 10:25:49.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.788 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.790 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.812 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:50 np0005595445 nova_compute[226322]: 2026-01-26 10:25:50.813 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:25:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:51.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:52 np0005595445 nova_compute[226322]: 2026-01-26 10:25:52.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:25:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:25:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:53.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:25:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:53.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.951 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:25:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:25:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:25:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:25:54 np0005595445 nova_compute[226322]: 2026-01-26 10:25:54.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:25:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:25:55 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2985707705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.149 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:25:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:55.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:55.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.354 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4797MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.355 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.417 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.417 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.440 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.815 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.819 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:25:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:25:55 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2820156879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.972 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.979 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:25:55 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.996 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:25:56 np0005595445 nova_compute[226322]: 2026-01-26 10:25:55.999 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:25:56 np0005595445 nova_compute[226322]: 2026-01-26 10:25:56.000 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:25:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:57.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:57.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:57 np0005595445 podman[249109]: 2026-01-26 10:25:57.363739922 +0000 UTC m=+0.131088670 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 26 05:25:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:25:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:25:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:25:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:25:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:25:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:25:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:25:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:25:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:25:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1247078311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:25:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:25:59.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:25:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:25:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:25:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:25:59.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:00 np0005595445 nova_compute[226322]: 2026-01-26 10:26:00.818 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:01.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:01.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:03.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:03.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:05.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:05.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.820 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.822 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.823 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:05 np0005595445 nova_compute[226322]: 2026-01-26 10:26:05.825 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:26:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:07.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:26:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:07.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:09.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:09.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:10 np0005595445 nova_compute[226322]: 2026-01-26 10:26:10.825 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:11.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:11.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:13 np0005595445 podman[249170]: 2026-01-26 10:26:13.292276403 +0000 UTC m=+0.058013273 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 26 05:26:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:13.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:13.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:15.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:15.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:15 np0005595445 nova_compute[226322]: 2026-01-26 10:26:15.827 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:15 np0005595445 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:15 np0005595445 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:26:15 np0005595445 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:15 np0005595445 nova_compute[226322]: 2026-01-26 10:26:15.828 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:17.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:17.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:19.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:19.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:20 np0005595445 nova_compute[226322]: 2026-01-26 10:26:20.829 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:21.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:21.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:23.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:25.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:25.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:25 np0005595445 nova_compute[226322]: 2026-01-26 10:26:25.830 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:27.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:27.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:28 np0005595445 podman[249221]: 2026-01-26 10:26:28.200717542 +0000 UTC m=+0.125534843 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 05:26:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:29.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:29.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:30 np0005595445 nova_compute[226322]: 2026-01-26 10:26:30.832 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:31.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:31.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:26:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:33.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:26:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:33.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:35.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:35.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:35 np0005595445 nova_compute[226322]: 2026-01-26 10:26:35.835 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:37.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:37.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:39.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:39.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.836 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.837 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.837 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.838 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.838 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:40 np0005595445 nova_compute[226322]: 2026-01-26 10:26:40.839 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:41.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:41.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:42 np0005595445 nova_compute[226322]: 2026-01-26 10:26:42.724 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:43.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:43.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:43 np0005595445 nova_compute[226322]: 2026-01-26 10:26:43.683 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:43 np0005595445 nova_compute[226322]: 2026-01-26 10:26:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:43 np0005595445 nova_compute[226322]: 2026-01-26 10:26:43.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:43 np0005595445 nova_compute[226322]: 2026-01-26 10:26:43.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:26:45 np0005595445 podman[249258]: 2026-01-26 10:26:45.027388598 +0000 UTC m=+0.080894703 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 05:26:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:45.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:45.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.840 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.841 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.842 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:45 np0005595445 nova_compute[226322]: 2026-01-26 10:26:45.843 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:46 np0005595445 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:46 np0005595445 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:26:46 np0005595445 nova_compute[226322]: 2026-01-26 10:26:46.689 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:26:46 np0005595445 nova_compute[226322]: 2026-01-26 10:26:46.712 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:26:46 np0005595445 nova_compute[226322]: 2026-01-26 10:26:46.713 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:47.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:47.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:47 np0005595445 nova_compute[226322]: 2026-01-26 10:26:47.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:49.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:49.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.717 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.843 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.845 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.845 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.846 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.876 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:26:50 np0005595445 nova_compute[226322]: 2026-01-26 10:26:50.877 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:26:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:26:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:51.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:26:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:51.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:26:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:53.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:26:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:53.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.953 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:26:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:26:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:26:53.954 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:26:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:26:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:26:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:26:53 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.104945) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215105015, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1586, "num_deletes": 257, "total_data_size": 3994264, "memory_usage": 4044808, "flush_reason": "Manual Compaction"}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215120693, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2599346, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39473, "largest_seqno": 41054, "table_properties": {"data_size": 2592642, "index_size": 3839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14234, "raw_average_key_size": 19, "raw_value_size": 2579060, "raw_average_value_size": 3612, "num_data_blocks": 165, "num_entries": 714, "num_filter_entries": 714, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423082, "oldest_key_time": 1769423082, "file_creation_time": 1769423215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 15820 microseconds, and 6389 cpu microseconds.
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.120776) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2599346 bytes OK
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.120796) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125544) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125564) EVENT_LOG_v1 {"time_micros": 1769423215125558, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.125582) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3986937, prev total WAL file size 3986937, number of live WAL files 2.
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.126729) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303031' seq:72057594037927935, type:22 .. '6C6F676D0031323534' seq:0, type:0; will stop at (end)
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2538KB)], [75(12MB)]
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215126776, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15677415, "oldest_snapshot_seqno": -1}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6919 keys, 15513175 bytes, temperature: kUnknown
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215200991, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15513175, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15466854, "index_size": 27905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 181700, "raw_average_key_size": 26, "raw_value_size": 15342100, "raw_average_value_size": 2217, "num_data_blocks": 1103, "num_entries": 6919, "num_filter_entries": 6919, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423215, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.201416) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15513175 bytes
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.202999) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.7 rd, 208.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 12.5 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(12.0) write-amplify(6.0) OK, records in: 7451, records dropped: 532 output_compression: NoCompression
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.203029) EVENT_LOG_v1 {"time_micros": 1769423215203016, "job": 46, "event": "compaction_finished", "compaction_time_micros": 74404, "compaction_time_cpu_micros": 34123, "output_level": 6, "num_output_files": 1, "total_output_size": 15513175, "num_input_records": 7451, "num_output_records": 6919, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215204170, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423215208626, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.126576) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:26:55.208828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:26:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:55.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:55.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.717 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.745 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.745 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.746 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:26:55 np0005595445 nova_compute[226322]: 2026-01-26 10:26:55.926 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:26:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:26:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1505490843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.236 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.409 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4843MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.410 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.597 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.598 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:26:56 np0005595445 nova_compute[226322]: 2026-01-26 10:26:56.612 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:26:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:26:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3786943580' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:26:57 np0005595445 nova_compute[226322]: 2026-01-26 10:26:57.061 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:26:57 np0005595445 nova_compute[226322]: 2026-01-26 10:26:57.066 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:26:57 np0005595445 nova_compute[226322]: 2026-01-26 10:26:57.093 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:26:57 np0005595445 nova_compute[226322]: 2026-01-26 10:26:57.095 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:26:57 np0005595445 nova_compute[226322]: 2026-01-26 10:26:57.096 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:26:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:26:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:57.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:26:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:26:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:26:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:26:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:26:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:26:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4254120803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:26:58 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:26:59 np0005595445 podman[249459]: 2026-01-26 10:26:59.311395003 +0000 UTC m=+0.084702646 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 05:26:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:26:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:26:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:26:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:26:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:26:59.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.928 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.930 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.966 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:00 np0005595445 nova_compute[226322]: 2026-01-26 10:27:00.967 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:01.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:01.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:03.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:27:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:03.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:27:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:05.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:05.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:05 np0005595445 nova_compute[226322]: 2026-01-26 10:27:05.967 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:05 np0005595445 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:05 np0005595445 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:05 np0005595445 nova_compute[226322]: 2026-01-26 10:27:05.969 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:06 np0005595445 nova_compute[226322]: 2026-01-26 10:27:06.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:06 np0005595445 nova_compute[226322]: 2026-01-26 10:27:06.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:07.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:07.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:09.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:09.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:11 np0005595445 nova_compute[226322]: 2026-01-26 10:27:11.001 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:11 np0005595445 nova_compute[226322]: 2026-01-26 10:27:11.003 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:11.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:11.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:15 np0005595445 podman[249518]: 2026-01-26 10:27:15.264770851 +0000 UTC m=+0.045116904 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 26 05:27:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:15.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:15.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.004 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.005 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.006 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.044 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:16 np0005595445 nova_compute[226322]: 2026-01-26 10:27:16.044 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:17.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:17.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:19.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:19.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:21 np0005595445 nova_compute[226322]: 2026-01-26 10:27:21.045 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:27:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:21.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:27:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:21.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:23.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:23.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:25.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:25.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.048 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.049 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.049 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.050 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.089 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:26 np0005595445 nova_compute[226322]: 2026-01-26 10:27:26.090 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:27:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:27.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:27:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:27.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:29.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:30 np0005595445 podman[249570]: 2026-01-26 10:27:30.333592658 +0000 UTC m=+0.105664805 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 05:27:31 np0005595445 nova_compute[226322]: 2026-01-26 10:27:31.090 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:31 np0005595445 nova_compute[226322]: 2026-01-26 10:27:31.092 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:31.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:35.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:35.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.093 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.095 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.096 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.096 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.134 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:36 np0005595445 nova_compute[226322]: 2026-01-26 10:27:36.134 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:37.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:37.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:39.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:39.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:41 np0005595445 nova_compute[226322]: 2026-01-26 10:27:41.135 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:41.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:42 np0005595445 nova_compute[226322]: 2026-01-26 10:27:42.697 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:43.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:43.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:43 np0005595445 nova_compute[226322]: 2026-01-26 10:27:43.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:44 np0005595445 nova_compute[226322]: 2026-01-26 10:27:44.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:45.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:45.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:45 np0005595445 nova_compute[226322]: 2026-01-26 10:27:45.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:45 np0005595445 nova_compute[226322]: 2026-01-26 10:27:45.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.137 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.139 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.140 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.140 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.169 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:46 np0005595445 nova_compute[226322]: 2026-01-26 10:27:46.170 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:46 np0005595445 podman[249606]: 2026-01-26 10:27:46.299467852 +0000 UTC m=+0.066775001 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Jan 26 05:27:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:47.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:47.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:47 np0005595445 nova_compute[226322]: 2026-01-26 10:27:47.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:47 np0005595445 nova_compute[226322]: 2026-01-26 10:27:47.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:27:47 np0005595445 nova_compute[226322]: 2026-01-26 10:27:47.688 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:27:47 np0005595445 nova_compute[226322]: 2026-01-26 10:27:47.706 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:27:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:48 np0005595445 nova_compute[226322]: 2026-01-26 10:27:48.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:49.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:49.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:49 np0005595445 nova_compute[226322]: 2026-01-26 10:27:49.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:50 np0005595445 nova_compute[226322]: 2026-01-26 10:27:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:51 np0005595445 nova_compute[226322]: 2026-01-26 10:27:51.171 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:27:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:51.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:27:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:51.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:52 np0005595445 nova_compute[226322]: 2026-01-26 10:27:52.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:53.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:53.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:27:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:27:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:27:53.955 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:27:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:55.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:27:55 np0005595445 nova_compute[226322]: 2026-01-26 10:27:55.710 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:27:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:27:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1747333867' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.161 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.173 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.174 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.209 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.210 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.384 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4852MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.386 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.522 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.522 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.537 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing inventories for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.598 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating ProviderTree inventory for provider d06842a0-5d13-4573-bb78-d433bbb380e4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.599 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Updating inventory in ProviderTree for provider d06842a0-5d13-4573-bb78-d433bbb380e4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.620 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing aggregate associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.643 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Refreshing trait associations for resource provider d06842a0-5d13-4573-bb78-d433bbb380e4, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE4A,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 26 05:27:56 np0005595445 nova_compute[226322]: 2026-01-26 10:27:56.659 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:27:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:27:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4282693921' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:27:57 np0005595445 nova_compute[226322]: 2026-01-26 10:27:57.115 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:27:57 np0005595445 nova_compute[226322]: 2026-01-26 10:27:57.120 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:27:57 np0005595445 nova_compute[226322]: 2026-01-26 10:27:57.139 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:27:57 np0005595445 nova_compute[226322]: 2026-01-26 10:27:57.140 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:27:57 np0005595445 nova_compute[226322]: 2026-01-26 10:27:57.140 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:27:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:57.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:27:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:57.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:27:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:27:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:27:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:27:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:27:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:27:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:27:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:27:59.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:27:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:27:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:27:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:27:59.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:01 np0005595445 nova_compute[226322]: 2026-01-26 10:28:01.244 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:01 np0005595445 podman[249782]: 2026-01-26 10:28:01.341603606 +0000 UTC m=+0.084993225 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 05:28:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:01.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:01.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:02 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:28:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:03 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:28:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:03.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:03.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:05.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:05.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.249 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.252 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.271 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:06 np0005595445 nova_compute[226322]: 2026-01-26 10:28:06.272 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:07 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:07 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:28:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:07.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:07.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:09.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:09.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:11 np0005595445 nova_compute[226322]: 2026-01-26 10:28:11.273 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:12 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.024279) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293024309, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1034, "num_deletes": 251, "total_data_size": 2473634, "memory_usage": 2507120, "flush_reason": "Manual Compaction"}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293039031, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1584837, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41059, "largest_seqno": 42088, "table_properties": {"data_size": 1580168, "index_size": 2257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10334, "raw_average_key_size": 19, "raw_value_size": 1570776, "raw_average_value_size": 3009, "num_data_blocks": 99, "num_entries": 522, "num_filter_entries": 522, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423216, "oldest_key_time": 1769423216, "file_creation_time": 1769423293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 14789 microseconds, and 5192 cpu microseconds.
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.039066) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1584837 bytes OK
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.039085) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040830) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040841) EVENT_LOG_v1 {"time_micros": 1769423293040838, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.040867) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 2468521, prev total WAL file size 2468521, number of live WAL files 2.
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.041605) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1547KB)], [78(14MB)]
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293041693, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17098012, "oldest_snapshot_seqno": -1}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6925 keys, 14808621 bytes, temperature: kUnknown
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293108022, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14808621, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14763363, "index_size": 26842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 182522, "raw_average_key_size": 26, "raw_value_size": 14639581, "raw_average_value_size": 2114, "num_data_blocks": 1053, "num_entries": 6925, "num_filter_entries": 6925, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423293, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.108246) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14808621 bytes
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.110123) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 257.5 rd, 223.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 14.8 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(20.1) write-amplify(9.3) OK, records in: 7441, records dropped: 516 output_compression: NoCompression
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.110138) EVENT_LOG_v1 {"time_micros": 1769423293110131, "job": 48, "event": "compaction_finished", "compaction_time_micros": 66388, "compaction_time_cpu_micros": 27683, "output_level": 6, "num_output_files": 1, "total_output_size": 14808621, "num_input_records": 7441, "num_output_records": 6925, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293110443, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423293113043, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.041510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:28:13.113159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:28:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:13.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:15.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:16 np0005595445 nova_compute[226322]: 2026-01-26 10:28:16.276 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:17 np0005595445 podman[249867]: 2026-01-26 10:28:17.274827747 +0000 UTC m=+0.057933322 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:28:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:17.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:19.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:19.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:21 np0005595445 nova_compute[226322]: 2026-01-26 10:28:21.278 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:21.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:23.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:25.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:25.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:26 np0005595445 nova_compute[226322]: 2026-01-26 10:28:26.280 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:27.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:29.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:29.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:31 np0005595445 nova_compute[226322]: 2026-01-26 10:28:31.282 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:31.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:31.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:32 np0005595445 podman[249920]: 2026-01-26 10:28:32.387012259 +0000 UTC m=+0.157645105 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 05:28:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:33.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:33.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:35.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:35.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.285 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.286 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.287 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.287 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.288 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:36 np0005595445 nova_compute[226322]: 2026-01-26 10:28:36.290 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:37.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:37.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:39.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:39.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.291 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.293 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.304 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:41 np0005595445 nova_compute[226322]: 2026-01-26 10:28:41.304 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:28:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:41.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:41.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:43.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:44 np0005595445 nova_compute[226322]: 2026-01-26 10:28:44.141 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:45.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:45 np0005595445 nova_compute[226322]: 2026-01-26 10:28:45.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:46 np0005595445 nova_compute[226322]: 2026-01-26 10:28:46.305 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:28:46 np0005595445 nova_compute[226322]: 2026-01-26 10:28:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:47.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:47.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.700 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.700 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:47 np0005595445 nova_compute[226322]: 2026-01-26 10:28:47.701 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:28:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:48 np0005595445 podman[249954]: 2026-01-26 10:28:48.265443942 +0000 UTC m=+0.045536864 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 05:28:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:28:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:49.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:28:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:49.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:49 np0005595445 nova_compute[226322]: 2026-01-26 10:28:49.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:50 np0005595445 nova_compute[226322]: 2026-01-26 10:28:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:51 np0005595445 nova_compute[226322]: 2026-01-26 10:28:51.306 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:51.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:51.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:51 np0005595445 nova_compute[226322]: 2026-01-26 10:28:51.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:53.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:53.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.956 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:28:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:28:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:28:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:28:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:55.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.308 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.715 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:28:56 np0005595445 nova_compute[226322]: 2026-01-26 10:28:56.716 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:28:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:28:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1175763920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.209 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.432 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.435 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.436 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.436 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.519 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.519 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.539 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:28:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:57.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:28:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:28:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:28:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:28:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:28:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2657310510' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.984 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:28:57 np0005595445 nova_compute[226322]: 2026-01-26 10:28:57.991 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:28:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:28:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:28:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:28:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:28:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:28:58 np0005595445 nova_compute[226322]: 2026-01-26 10:28:58.011 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:28:58 np0005595445 nova_compute[226322]: 2026-01-26 10:28:58.013 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:28:58 np0005595445 nova_compute[226322]: 2026-01-26 10:28:58.013 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:28:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:28:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:28:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:28:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3037181311' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:28:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:28:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.002000054s ======
Jan 26 05:28:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:28:59.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Jan 26 05:28:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:28:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:28:59 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:28:59.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.309 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.311 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.312 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.378 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:01 np0005595445 nova_compute[226322]: 2026-01-26 10:29:01.379 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:01.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:01.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:03 np0005595445 podman[250049]: 2026-01-26 10:29:03.305325519 +0000 UTC m=+0.085128075 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 26 05:29:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:03.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:03.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:05.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:05 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:05 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:05 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:05.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:06 np0005595445 nova_compute[226322]: 2026-01-26 10:29:06.380 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:07.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:07 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:07 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:07 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:07.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:07 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:07 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:08 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:08 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 26 05:29:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:29:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:29:09 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 26 05:29:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:09 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:09 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:09 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:09 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:09.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:11 np0005595445 nova_compute[226322]: 2026-01-26 10:29:11.382 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:11 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:11 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:11 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:11 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:12 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:13 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:13 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:29:13 np0005595445 ceph-mon[80107]: from='mgr.14697 192.168.122.100:0/270092481' entity='mgr.compute-0.zllcia' 
Jan 26 05:29:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:13 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:13 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:13.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:13 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:13 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:13.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:15 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:15 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:15.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:15 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:15 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:15.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:16 np0005595445 nova_compute[226322]: 2026-01-26 10:29:16.385 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:17 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:17 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:17 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:17.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:17 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:17.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:17 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:17 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:18 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:18 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:19 np0005595445 podman[250216]: 2026-01-26 10:29:19.288429611 +0000 UTC m=+0.065764556 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 05:29:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:19 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:19 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:19.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:19 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:19 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:19.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:21 np0005595445 nova_compute[226322]: 2026-01-26 10:29:21.386 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:21 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:21 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:21.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:21 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:21 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:21.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:22 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:22 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:23 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:23 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:23 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:23 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:23 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:23 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:23.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:25 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:25 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:25 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:25 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:25.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:26 np0005595445 nova_compute[226322]: 2026-01-26 10:29:26.388 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:27 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab7ee5d0 =====
Jan 26 05:29:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:27 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:27 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab7ee5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:27 np0005595445 radosgw[82065]: beast: 0x7f17ab7ee5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:27 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:27 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:28 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:28 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:29 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:29 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:29 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:29.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.390 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.391 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.391 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.392 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.426 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:31 np0005595445 nova_compute[226322]: 2026-01-26 10:29:31.427 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:31.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:31 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:31 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:31 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:31.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:32 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:32 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:33 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:33 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:33.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:33 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:33 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:33 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:33.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:34 np0005595445 podman[250269]: 2026-01-26 10:29:34.357422892 +0000 UTC m=+0.139612813 container health_status 34bc7cc37f1cab33076a8df44e6e11f5f99517cf14827e1df70a4093b1c3d496 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 26 05:29:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:35.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:35 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:35 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:35 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.428 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.429 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.430 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 26 05:29:36 np0005595445 nova_compute[226322]: 2026-01-26 10:29:36.432 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:37.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:37 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:37 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:37 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:37.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:37 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:37 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:38 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:38 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:39 np0005595445 systemd-logind[783]: New session 58 of user zuul.
Jan 26 05:29:39 np0005595445 systemd[1]: Started Session 58 of User zuul.
Jan 26 05:29:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:39.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:39 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:39 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:39 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:39.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:41 np0005595445 nova_compute[226322]: 2026-01-26 10:29:41.432 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:41.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:41 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:41 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:41 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:41.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 26 05:29:42 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2606455545' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 26 05:29:42 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:42 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:43 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:43 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:43.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:43 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:43 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:43 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:43.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:45 np0005595445 nova_compute[226322]: 2026-01-26 10:29:45.014 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:45 np0005595445 ovs-vsctl[250625]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 05:29:45 np0005595445 nova_compute[226322]: 2026-01-26 10:29:45.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:45.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:45 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:45 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:45 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:45.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:46 np0005595445 nova_compute[226322]: 2026-01-26 10:29:46.433 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:46 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 05:29:46 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 05:29:46 np0005595445 virtqemud[225791]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 05:29:46 np0005595445 nova_compute[226322]: 2026-01-26 10:29:46.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: cache status {prefix=cache status} (starting...)
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: client ls {prefix=client ls} (starting...)
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:47 np0005595445 lvm[250980]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 26 05:29:47 np0005595445 lvm[250980]: VG ceph_vg0 finished
Jan 26 05:29:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:47 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:47 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:47 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:47.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: damage ls {prefix=damage ls} (starting...)
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump loads {prefix=dump loads} (starting...)
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:47 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 26 05:29:47 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1597042348' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 26 05:29:47 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:47 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:48 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:48 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 26 05:29:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3490216551' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:48 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 26 05:29:48 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3430578706' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.686 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.687 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.711 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.711 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:48 np0005595445 nova_compute[226322]: 2026-01-26 10:29:48.712 226326 DEBUG nova.compute.manager [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 26 05:29:48 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:49 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: ops {prefix=ops} (starting...)
Jan 26 05:29:49 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 26 05:29:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3278578878' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 26 05:29:49 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: session ls {prefix=session ls} (starting...)
Jan 26 05:29:49 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk Can't run that command on an inactive MDS!
Jan 26 05:29:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:49 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:49 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 26 05:29:49 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:49.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 26 05:29:49 np0005595445 ceph-mds[85919]: mds.cephfs.compute-1.rbkelk asok_command: status {prefix=status} (starting...)
Jan 26 05:29:49 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 05:29:49 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4111243696' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 05:29:50 np0005595445 podman[251416]: 2026-01-26 10:29:50.268476828 +0000 UTC m=+0.050871920 container health_status 6f8c848e4307d304f878a278218698c8db23d1a286e11426efbd023568a40c98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0a2bdd9ca85c110d360e1b96c9ab7abba927ef726c1f1a03bbbf5758fb36692b-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-801b1b784f52dafb148d21a0255653e5af8897d6a2d80caa5cedea4a3c41eace-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/266995504' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1646311646' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 26 05:29:50 np0005595445 nova_compute[226322]: 2026-01-26 10:29:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:50 np0005595445 nova_compute[226322]: 2026-01-26 10:29:50.686 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2590876303' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 26 05:29:50 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2610141522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/965486565' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 05:29:51 np0005595445 nova_compute[226322]: 2026-01-26 10:29:51.435 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2903462977' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 26 05:29:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:51 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:51 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:51 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:51.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 26 05:29:51 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2842339571' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1562471956' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441896431' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657819002' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 26 05:29:52 np0005595445 nova_compute[226322]: 2026-01-26 10:29:52.687 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:52 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:52 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:53 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:53 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d170b6b400 session 0x55d1721d3a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935394 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 74.881858826s of 74.889221191s, submitted: 2
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77643776 unmapped: 876544 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a000 session 0x55d1720950e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936906 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.996082306s of 18.000623703s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938418 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77684736 unmapped: 835584 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.481588364s of 14.488451958s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77701120 unmapped: 819200 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939339 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77709312 unmapped: 811008 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 794624 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17042a800 session 0x55d172095c20
Jan 26 05:29:53 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 761856 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d17260d0e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 737280 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 938748 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.857223511s of 56.950168610s, submitted: 4
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77848576 unmapped: 671744 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d172e81860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6838 writes, 27K keys, 6838 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6838 writes, 1330 syncs, 5.14 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 522 writes, 785 keys, 522 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
Interval WAL: 522 writes, 261 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d16ea37350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77864960 unmapped: 655360 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939669 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 58.666145325s of 58.674335480s, submitted: 2
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941181 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77889536 unmapped: 630784 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread fragmentation_score=0.000026 took=0.000118s
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d171202800 session 0x55d17260c1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 940590 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d172163800 session 0x55d16feeed20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.543655396s of 56.563098907s, submitted: 2
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 939999 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.324966431s of 10.332220078s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.846134186s of 36.849975586s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78290944 unmapped: 229376 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78323712 unmapped: 196608 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78331904 unmapped: 188416 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78340096 unmapped: 180224 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78348288 unmapped: 172032 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78356480 unmapped: 163840 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 37.952041626s of 38.464164734s, submitted: 139
Jan 26 05:29:53 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 26 05:29:53 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2923609229' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78381056 unmapped: 1187840 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78413824 unmapped: 1155072 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78422016 unmapped: 1146880 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78430208 unmapped: 1138688 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 941511 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 52.869400024s of 53.368377686s, submitted: 86
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943023 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.398575783s of 12.402492523s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17254c000 session 0x55d17293a5a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 943944 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.989221573s of 20.182910919s, submitted: 2
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 945456 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78438400 unmapped: 1130496 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78446592 unmapped: 1122304 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78462976 unmapped: 1105920 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 ms_handle_reset con 0x55d17035d000 session 0x55d16feee1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78471168 unmapped: 1097728 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78479360 unmapped: 1089536 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78495744 unmapped: 1073152 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 944865 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 heartbeat osd_stat(store_statfs(0x4fca05000/0x0/0x4ffc00000, data 0x16201e/0x217000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 79.554496765s of 79.564147949s, submitted: 3
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 143 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 78536704 unmapped: 1032192 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 79724544 unmapped: 16629760 heap: 96354304 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 146 ms_handle_reset con 0x55d171202800 session 0x55d17263d4a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 146 heartbeat osd_stat(store_statfs(0x4fb58a000/0x0/0x4ffc00000, data 0x15d8375/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80936960 unmapped: 23814144 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 147 ms_handle_reset con 0x55d17042ac00 session 0x55d172584d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1101069 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 147 heartbeat osd_stat(store_statfs(0x4fb585000/0x0/0x4ffc00000, data 0x15da4a0/0x1695000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80953344 unmapped: 23797760 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80961536 unmapped: 23789568 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80969728 unmapped: 23781376 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102895 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 40.111835480s of 40.450466156s, submitted: 51
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb583000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1104407 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80977920 unmapped: 23773184 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1102976 data_alloc: 218103808 data_used: 176128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 80986112 unmapped: 23764992 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d172151000 session 0x55d172095e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 ms_handle_reset con 0x55d17035d000 session 0x55d172672960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92413952 unmapped: 12337152 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 heartbeat osd_stat(store_statfs(0x4fb584000/0x0/0x4ffc00000, data 0x15dc472/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.857666016s of 12.088842392s, submitted: 2
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 92504064 unmapped: 12247040 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb57c000/0x0/0x4ffc00000, data 0x15e06b1/0x169f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d17019b680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d171202800 session 0x55d172671680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17254c000 session 0x55d172671860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1167114 data_alloc: 234881024 data_used: 11649024
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d172153400 session 0x55d172671a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17035d000 session 0x55d172671c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 heartbeat osd_stat(store_statfs(0x4fb234000/0x0/0x4ffc00000, data 0x19286b1/0x19e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94339072 unmapped: 10412032 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 ms_handle_reset con 0x55d17042ac00 session 0x55d1726714a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170865 data_alloc: 234881024 data_used: 11649024
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 94191616 unmapped: 10559488 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d170188f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97239040 unmapped: 7512064 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fb231000/0x0/0x4ffc00000, data 0x192a683/0x19ea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192885 data_alloc: 234881024 data_used: 14888960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 97329152 unmapped: 7421952 heap: 104751104 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.199676514s of 19.449176788s, submitted: 39
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102621184 unmapped: 4235264 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102588416 unmapped: 4268032 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa917000/0x0/0x4ffc00000, data 0x2245683/0x2305000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274279 data_alloc: 234881024 data_used: 15114240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa876000/0x0/0x4ffc00000, data 0x22e6683/0x23a6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102457344 unmapped: 4399104 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 4210688 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272359 data_alloc: 234881024 data_used: 15114240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa852000/0x0/0x4ffc00000, data 0x230a683/0x23ca000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102244352 unmapped: 4612096 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.935168266s of 12.713699341s, submitted: 84
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272375 data_alloc: 234881024 data_used: 15114240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa847000/0x0/0x4ffc00000, data 0x2315683/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271784 data_alloc: 234881024 data_used: 15114240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102408192 unmapped: 4448256 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102096896 unmapped: 4759552 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172672d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172994000 session 0x55d172896780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17019e000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 102088704 unmapped: 4767744 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1725c54a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273484 data_alloc: 234881024 data_used: 15114240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1725c5680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103366656 unmapped: 3489792 heap: 106856448 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266eb40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.664448738s of 11.864383698s, submitted: 7
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103407616 unmapped: 9748480 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8cd20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d1726721e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d17034ef00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d1703c14a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17289b680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa844000/0x0/0x4ffc00000, data 0x2318683/0x23d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318619 data_alloc: 234881024 data_used: 15642624
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103424000 unmapped: 9732096 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b2000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172661400 session 0x55d16f9e6960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 103055360 unmapped: 10100736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 8486912 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1355120 data_alloc: 234881024 data_used: 20631552
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17019ab40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360592 data_alloc: 234881024 data_used: 21483520
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.041341782s of 14.129011154s, submitted: 19
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108175360 unmapped: 4980736 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2b1000/0x0/0x4ffc00000, data 0x28aa683/0x296a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 4947968 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 4915200 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4fa2af000/0x0/0x4ffc00000, data 0x28ad683/0x296d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 108249088 unmapped: 4907008 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360332 data_alloc: 234881024 data_used: 21483520
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109010944 unmapped: 4145152 heap: 113156096 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 2949120 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 5578752 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113344512 unmapped: 4759552 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1447918 data_alloc: 234881024 data_used: 22425600
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113393664 unmapped: 4710400 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113582080 unmapped: 4521984 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.826783180s of 12.373358727s, submitted: 98
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83e5000/0x0/0x4ffc00000, data 0x31b9683/0x3279000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442638 data_alloc: 234881024 data_used: 22425600
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f83ea000/0x0/0x4ffc00000, data 0x31c2683/0x3282000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d17289a000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d1703c6d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 113565696 unmapped: 4538368 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1730cab40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370749 data_alloc: 234881024 data_used: 18501632
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042ac00 session 0x55d170d8c1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172e48000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f905a000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285633 data_alloc: 234881024 data_used: 15642624
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.467782021s of 12.590607643s, submitted: 41
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9058000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109101056 unmapped: 9003008 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba1860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109109248 unmapped: 8994816 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9283000/0x0/0x4ffc00000, data 0x2327683/0x23e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,1])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 11517952 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1728970e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166733 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.436046600s of 11.544039726s, submitted: 32
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1168245 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a000 session 0x55d172ba0780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172ba1a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172ba1680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172ba14a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ba0d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106594304 unmapped: 11509760 heap: 118104064 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17250dc00 session 0x55d1701881e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.435865402s of 23.439153671s, submitted: 1
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170045 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170188f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170d8c960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170d8c000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d170d8c5a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993000 session 0x55d16f9e6960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d16f9e7a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d16f9e6000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193785 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d16f9e65a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172ee6b40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106414080 unmapped: 13860864 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106422272 unmapped: 13852672 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212814 data_alloc: 234881024 data_used: 14893056
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107012096 unmapped: 13262848 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 107020288 unmapped: 13254656 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.067176819s of 19.197723389s, submitted: 13
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10002432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9d30000/0x0/0x4ffc00000, data 0x187b693/0x193c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271222 data_alloc: 234881024 data_used: 15110144
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109248512 unmapped: 11026432 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9553000/0x0/0x4ffc00000, data 0x2052693/0x2113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1278384 data_alloc: 234881024 data_used: 14929920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109887488 unmapped: 10387456 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9539000/0x0/0x4ffc00000, data 0x2064693/0x2125000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109895680 unmapped: 10379264 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096400 session 0x55d17263d2c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.033960342s of 30.179452896s, submitted: 56
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1730ca5a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 109903872 unmapped: 10371072 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172670000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1176888 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106717184 unmapped: 13557760 heap: 120274944 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172effe00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1726e9680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c000 session 0x55d172b9e1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172e81c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.123188019s of 25.234869003s, submitted: 31
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 116154368 unmapped: 21454848 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172672780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d172896f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d170400d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172997800 session 0x55d170400000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1703e32c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1292908 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170047680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 106971136 unmapped: 30638080 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118013952 unmapped: 19595264 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1402500 data_alloc: 251658240 data_used: 28434432
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9028000/0x0/0x4ffc00000, data 0x2584683/0x2644000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118046720 unmapped: 19562496 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.038774490s of 20.122617722s, submitted: 17
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 17965056 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1445412 data_alloc: 251658240 data_used: 28798976
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 15523840 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122789888 unmapped: 14819328 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a3b000/0x0/0x4ffc00000, data 0x2b71683/0x2c31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 14786560 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 14753792 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122863616 unmapped: 14745600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8201 writes, 31K keys, 8201 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 8201 writes, 1912 syncs, 4.29 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1363 writes, 4432 keys, 1363 commit groups, 1.0 writes per commit group, ingest: 4.48 MB, 0.01 MB/s#012Interval WAL: 1363 writes, 582 syncs, 2.34 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2b000/0x0/0x4ffc00000, data 0x2b81683/0x2c41000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1458516 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122896384 unmapped: 14712832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d173144f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172037a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730ca5a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1730caf00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.439634323s of 18.733009338s, submitted: 55
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172896960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172ee7c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d170188960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d172ee6b40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1476366 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87cb000/0x0/0x4ffc00000, data 0x2de06e5/0x2ea1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 15179776 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259ab40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 16023552 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1481927 data_alloc: 251658240 data_used: 30072832
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 15736832 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494239 data_alloc: 251658240 data_used: 31916032
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123502592 unmapped: 14106624 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87ca000/0x0/0x4ffc00000, data 0x2de0708/0x2ea2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123510784 unmapped: 14098432 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.753458977s of 13.886870384s, submitted: 41
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f87c8000/0x0/0x4ffc00000, data 0x2de1708/0x2ea3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1494831 data_alloc: 251658240 data_used: 31920128
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 14065664 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131710976 unmapped: 5898240 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 133734400 unmapped: 3874816 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7899000/0x0/0x4ffc00000, data 0x3d11708/0x3dd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1622323 data_alloc: 251658240 data_used: 33873920
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7872000/0x0/0x4ffc00000, data 0x3d38708/0x3dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132063232 unmapped: 5545984 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.394704819s of 10.849593163s, submitted: 125
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 132079616 unmapped: 5529600 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d172670000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720374a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f786f000/0x0/0x4ffc00000, data 0x3d3b708/0x3dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1703c65a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b826a6/0x2c43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 7643136 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1463639 data_alloc: 251658240 data_used: 29515776
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d170047c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17259b680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 7634944 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d172671860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f842a000/0x0/0x4ffc00000, data 0x2b82683/0x2c42000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197707 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 114851840 unmapped: 22757376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.995807648s of 35.219715118s, submitted: 71
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d2000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202800 session 0x55d170401c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170c97a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c54a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d1703c4780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254d000 session 0x55d1726e8960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218652 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 22437888 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 22233088 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9e2d000/0x0/0x4ffc00000, data 0x177e6e5/0x183f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222452 data_alloc: 234881024 data_used: 12705792
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 19.421459198s of 19.482046127s, submitted: 29
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 22224896 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117194752 unmapped: 20414464 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119021568 unmapped: 18587648 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 17473536 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120143872 unmapped: 17465344 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120152064 unmapped: 17457152 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268948 data_alloc: 234881024 data_used: 13053952
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120160256 unmapped: 17448960 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9898000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120168448 unmapped: 17440768 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.392076492s of 32.704330444s, submitted: 79
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 18792448 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262116 data_alloc: 234881024 data_used: 13058048
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118939648 unmapped: 18669568 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118947840 unmapped: 18661376 heap: 137609216 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fde00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1721fc3c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721fd680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266f680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fa40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266e960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17266f0e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266e000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d17266e780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a7000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119488512 unmapped: 25468928 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302588 data_alloc: 234881024 data_used: 13058048
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.978124619s of 10.595539093s, submitted: 169
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 25534464 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169c00 session 0x55d17266fc20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1304100 data_alloc: 234881024 data_used: 13041664
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266f4a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119439360 unmapped: 25518080 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f5000/0x0/0x4ffc00000, data 0x21b66e5/0x2277000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17266e1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119447552 unmapped: 25509888 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d173145e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303553 data_alloc: 234881024 data_used: 13041664
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 25501696 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1314801 data_alloc: 234881024 data_used: 14536704
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d172659680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f93f4000/0x0/0x4ffc00000, data 0x21b6708/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1335625 data_alloc: 234881024 data_used: 17674240
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119128064 unmapped: 25829376 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.355880737s of 25.392654419s, submitted: 18
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8a2c000/0x0/0x4ffc00000, data 0x2b76708/0x2c38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172162400 session 0x55d1730ca1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 22511616 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409087 data_alloc: 234881024 data_used: 17907712
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f899e000/0x0/0x4ffc00000, data 0x2c0c708/0x2cce000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [0,0,0,0,0,0,0,0,0,0,0,1,1])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122363904 unmapped: 22593536 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122634240 unmapped: 22323200 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122855424 unmapped: 22102016 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122871808 unmapped: 22085632 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 22200320 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1419099 data_alloc: 234881024 data_used: 17899520
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122822656 unmapped: 22134784 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 4.809408665s of 10.088632584s, submitted: 133
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122904576 unmapped: 22052864 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122912768 unmapped: 22044672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f891a000/0x0/0x4ffc00000, data 0x2c90708/0x2d52000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f88f9000/0x0/0x4ffc00000, data 0x2cb1708/0x2d73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 22036480 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420491 data_alloc: 234881024 data_used: 17899520
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122929152 unmapped: 22028288 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721d2000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172169800 session 0x55d1703c61e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 122937344 unmapped: 22020096 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259b680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273109 data_alloc: 234881024 data_used: 13041664
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f98a6000/0x0/0x4ffc00000, data 0x1d046e5/0x1dc5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.456459999s of 13.937705994s, submitted: 67
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1703c6960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d1725841e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 24018944 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1272745 data_alloc: 234881024 data_used: 13041664
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1726701e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119382016 unmapped: 25575424 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 25567232 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 25559040 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9b6d000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 25550848 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213577 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 46.250705719s of 46.323696136s, submitted: 22
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c4f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170d8dc20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d17263c780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d170d8d2c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d172895860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224861 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 25116672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17254c400 session 0x55d17263d680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1730cbc20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 119824384 unmapped: 25133056 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d170046780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1721fc780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118849536 unmapped: 26107904 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118734848 unmapped: 26222592 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1229754 data_alloc: 234881024 data_used: 12795904
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9eff000/0x0/0x4ffc00000, data 0x16ad683/0x176d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 26214400 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1720954a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.380515099s of 11.418202400s, submitted: 13
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d17263c960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17266fe00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118751232 unmapped: 26206208 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: mgrc ms_handle_reset ms_handle_reset con 0x55d171202c00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/2891176105
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/2891176105,v1:192.168.122.100:6801/2891176105]
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: mgrc handle_mgr_configure stats_period=5
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172167c00 session 0x55d1725c45a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172eff4a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216030 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 26140672 heap: 144957440 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fca000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.299061775s of 15.430803299s, submitted: 25
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d172ee72c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1721fc780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d1703c5e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172494400 session 0x55d1728943c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172895860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955c000/0x0/0x4ffc00000, data 0x2050683/0x2110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17042a800 session 0x55d1703c70e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118333440 unmapped: 36143104 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164000 session 0x55d17259af00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118349824 unmapped: 36126720 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172164400 session 0x55d17259a000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726701e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f955a000/0x0/0x4ffc00000, data 0x20506b6/0x2112000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117383168 unmapped: 37093376 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301098 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117170176 unmapped: 37306368 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123084800 unmapped: 31391744 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1703c14a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17293a780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.107300758s of 10.256335258s, submitted: 30
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117915648 unmapped: 36560896 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d16f7d4000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117932032 unmapped: 36544512 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117940224 unmapped: 36536320 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1225121 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d17266fe00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173145c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d173144960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172659680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 36528128 heap: 154476544 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 30.658174515s of 30.887229919s, submitted: 38
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1731441e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172095c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d172897680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172993800 session 0x55d170c97a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172894d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293569 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172894b40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117514240 unmapped: 38584320 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528000 session 0x55d172896960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172095860
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117555200 unmapped: 38543360 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172ee70e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f80683/0x2040000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117858304 unmapped: 38240256 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320800 data_alloc: 234881024 data_used: 15335424
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120201216 unmapped: 35897344 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9606000/0x0/0x4ffc00000, data 0x1fa46b6/0x2066000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359864 data_alloc: 234881024 data_used: 21143552
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 120209408 unmapped: 35889152 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.557689667s of 17.637289047s, submitted: 14
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 32169984 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124157952 unmapped: 31940608 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f919b000/0x0/0x4ffc00000, data 0x240f6b6/0x24d1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 31776768 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124329984 unmapped: 31768576 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417424 data_alloc: 234881024 data_used: 21848064
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9169000/0x0/0x4ffc00000, data 0x24406b6/0x2502000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1420008 data_alloc: 234881024 data_used: 22011904
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124395520 unmapped: 31703040 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172036f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172585e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9167000/0x0/0x4ffc00000, data 0x24436b6/0x2505000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.894859314s of 12.680984497s, submitted: 52
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e80b40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 38658048 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117448704 unmapped: 38649856 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1237102 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117456896 unmapped: 38641664 heap: 156098560 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.971050262s of 24.083293915s, submitted: 28
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae9000 session 0x55d17311c1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172897a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172095c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d1726e83c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172efed20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117735424 unmapped: 46235648 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 46227456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d173144960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172163400 session 0x55d1725845a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117751808 unmapped: 46219264 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 117760000 unmapped: 46211072 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353511 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9045000/0x0/0x4ffc00000, data 0x2567683/0x2627000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1721d3a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172e81e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118374400 unmapped: 45596672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 118382592 unmapped: 45588480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 123191296 unmapped: 40779776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128458752 unmapped: 35512320 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128491520 unmapped: 35479552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128499712 unmapped: 35471360 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1468282 data_alloc: 251658240 data_used: 28352512
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9020000/0x0/0x4ffc00000, data 0x258b693/0x264c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128532480 unmapped: 35438592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128548864 unmapped: 35422208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172ae8c00 session 0x55d170d8d0e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d170d8dc20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 20.590911865s of 20.736030579s, submitted: 31
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d170401e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172673680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172672f00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d1726730e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172c29800 session 0x55d17034e1e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d172672d20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d16f9e63c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137723904 unmapped: 26247168 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1635802 data_alloc: 251658240 data_used: 29327360
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e20000/0x0/0x4ffc00000, data 0x3781705/0x3844000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138059776 unmapped: 25911296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 138067968 unmapped: 25903104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d17259b680
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 136216576 unmapped: 27754496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1632727 data_alloc: 251658240 data_used: 29339648
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137707520 unmapped: 26263552 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139845632 unmapped: 24125440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 139853824 unmapped: 24117248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7e04000/0x0/0x4ffc00000, data 0x37a5705/0x3868000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1686383 data_alloc: 251658240 data_used: 34185216
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.407573700s of 17.746696472s, submitted: 131
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 140615680 unmapped: 23355392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143704064 unmapped: 20267008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f737e000/0x0/0x4ffc00000, data 0x4223705/0x42e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143548416 unmapped: 20422656 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1784443 data_alloc: 251658240 data_used: 35086336
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143171584 unmapped: 20799488 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143179776 unmapped: 20791296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7304000/0x0/0x4ffc00000, data 0x42a5705/0x4368000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143212544 unmapped: 20758528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1785315 data_alloc: 251658240 data_used: 35086336
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143245312 unmapped: 20725760 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786531 data_alloc: 251658240 data_used: 35164160
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.249229431s of 14.751939774s, submitted: 97
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143269888 unmapped: 20701184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f72e3000/0x0/0x4ffc00000, data 0x42c6705/0x4389000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 142999552 unmapped: 20971520 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17118a400 session 0x55d16f7d4780
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 143065088 unmapped: 20905984 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1786215 data_alloc: 251658240 data_used: 35164160
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1704003c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f7f93000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137355264 unmapped: 26615808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1579255 data_alloc: 234881024 data_used: 26243072
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.313569069s of 11.179004669s, submitted: 65
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172ee61e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d17311c000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137371648 unmapped: 26599424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f81e3000/0x0/0x4ffc00000, data 0x2fb8693/0x3079000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 137388032 unmapped: 26583040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 38322176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275770 data_alloc: 234881024 data_used: 12288000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [0,0,1])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172eff4a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1269126 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172097800 session 0x55d172efe3c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d1726725a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d1726590e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172659a40
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 38297600 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 25.249382019s of 28.619909286s, submitted: 55
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1371866 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e80960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d17311c3c0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172528c00 session 0x55d172585c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17259af00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172096800 session 0x55d172894000
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352162 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 38215680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9067000/0x0/0x4ffc00000, data 0x2135683/0x21f5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172523000 session 0x55d172673e00
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 38158336 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432927 data_alloc: 234881024 data_used: 24039424
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 128794624 unmapped: 35176448 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9066000/0x0/0x4ffc00000, data 0x21356a6/0x21f6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.371334076s of 18.521051407s, submitted: 16
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f50000/0x0/0x4ffc00000, data 0x22456a6/0x2306000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 130768896 unmapped: 33202176 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1444671 data_alloc: 234881024 data_used: 24260608
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131121152 unmapped: 32849920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1457483 data_alloc: 234881024 data_used: 24080384
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172dc3c00 session 0x55d172e810e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d171202000 session 0x55d1726701e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f8f2d000/0x0/0x4ffc00000, data 0x22666a6/0x2327000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131129344 unmapped: 32841728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 32833536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.647443771s of 15.735019684s, submitted: 39
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280707 data_alloc: 234881024 data_used: 12181504
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e26a6/0x16a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d16f7c1400 session 0x55d17311cd20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124674048 unmapped: 39297024 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf dump' '{prefix=perf dump}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 39133184 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf schema' '{prefix=perf schema}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 39534592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124436480 unmapped: 39534592 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124444672 unmapped: 39526400 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124452864 unmapped: 39518208 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124461056 unmapped: 39510016 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 39501824 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124477440 unmapped: 39493632 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
Cumulative WAL: 10K writes, 2923 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2426 writes, 8573 keys, 2426 commit groups, 1.0 writes per commit group, ingest: 9.03 MB, 0.02 MB/s
Interval WAL: 2426 writes, 1011 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124485632 unmapped: 39485440 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 234881024 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124493824 unmapped: 39477248 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124502016 unmapped: 39469056 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124510208 unmapped: 39460864 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124518400 unmapped: 39452672 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124526592 unmapped: 39444480 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124534784 unmapped: 39436288 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124542976 unmapped: 39428096 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124551168 unmapped: 39419904 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124559360 unmapped: 39411712 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124575744 unmapped: 39395328 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280151 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bb9000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124583936 unmapped: 39387136 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 243.777282715s of 244.093200684s, submitted: 27
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124600320 unmapped: 39370752 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124608512 unmapped: 39362560 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124731392 unmapped: 39239680 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124747776 unmapped: 39223296 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124755968 unmapped: 39215104 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124772352 unmapped: 39198720 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.141582489s of 36.681442261s, submitted: 145
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124780544 unmapped: 39190528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124796928 unmapped: 39174144 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124911616 unmapped: 39059456 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17035d000 session 0x55d170400960
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124952576 unmapped: 39018496 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 39010304 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124968960 unmapped: 39002112 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124977152 unmapped: 38993920 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124985344 unmapped: 38985728 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 38977536 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125001728 unmapped: 38969344 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125009920 unmapped: 38961152 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125018112 unmapped: 38952960 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 38944768 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 38936576 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125042688 unmapped: 38928384 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125050880 unmapped: 38920192 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125059072 unmapped: 38912000 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125067264 unmapped: 38903808 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125075456 unmapped: 38895616 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125083648 unmapped: 38887424 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125091840 unmapped: 38879232 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125100032 unmapped: 38871040 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125108224 unmapped: 38862848 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125124608 unmapped: 38846464 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 38838272 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125140992 unmapped: 38830080 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125149184 unmapped: 38821888 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 38813696 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125165568 unmapped: 38805504 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 38789120 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 38780928 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 38772736 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172166800 session 0x55d1720365a0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 38756352 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 38748160 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 38739968 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 38731776 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 38723584 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 38715392 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 38707200 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 38699008 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 38690816 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 38682624 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 38674432 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125304832 unmapped: 38666240 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125313024 unmapped: 38658048 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125321216 unmapped: 38649856 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125321216 unmapped: 38649856 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d17116ac00 session 0x55d1731450e0
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 ms_handle_reset con 0x55d172507800 session 0x55d172ee7c20
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125329408 unmapped: 38641664 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125329408 unmapped: 38641664 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125337600 unmapped: 38633472 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125353984 unmapped: 38617088 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125362176 unmapped: 38608896 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125362176 unmapped: 38608896 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: osd.1 151 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15e2683/0x16a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,2] op hist [])
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279859 data_alloc: 218103808 data_used: 12177408
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125370368 unmapped: 38600704 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}'
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125444096 unmapped: 38526976 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 38166528 heap: 163971072 old mem: 2845415833 new mem: 2845415833
Jan 26 05:29:53 np0005595445 ceph-osd[77632]: do_command 'log dump' '{prefix=log dump}'
Jan 26 05:29:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:53.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:53 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:53 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:53 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:53.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:29:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:29:53 np0005595445 ovn_metadata_agent[143321]: 2026-01-26 10:29:53.957 143326 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3975494299' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2206345920' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 05:29:54 np0005595445 nova_compute[226322]: 2026-01-26 10:29:54.681 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 05:29:54 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2249248986' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.363671) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395363729, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1344, "num_deletes": 250, "total_data_size": 3092090, "memory_usage": 3120232, "flush_reason": "Manual Compaction"}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395371048, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1272752, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42093, "largest_seqno": 43432, "table_properties": {"data_size": 1267847, "index_size": 2173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14079, "raw_average_key_size": 21, "raw_value_size": 1256855, "raw_average_value_size": 1948, "num_data_blocks": 95, "num_entries": 645, "num_filter_entries": 645, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769423294, "oldest_key_time": 1769423294, "file_creation_time": 1769423395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 7407 microseconds, and 3313 cpu microseconds.
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.371083) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1272752 bytes OK
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.371101) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372676) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372690) EVENT_LOG_v1 {"time_micros": 1769423395372687, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.372730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 3085468, prev total WAL file size 3085468, number of live WAL files 2.
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.373442) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323530' seq:72057594037927935, type:22 .. '6D6772737461740031353031' seq:0, type:0; will stop at (end)
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1242KB)], [81(14MB)]
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395373489, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 16081373, "oldest_snapshot_seqno": -1}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 7102 keys, 12714908 bytes, temperature: kUnknown
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395430867, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 12714908, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12671863, "index_size": 24181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17797, "raw_key_size": 187043, "raw_average_key_size": 26, "raw_value_size": 12548292, "raw_average_value_size": 1766, "num_data_blocks": 942, "num_entries": 7102, "num_filter_entries": 7102, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769420465, "oldest_key_time": 0, "file_creation_time": 1769423395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4e64dd99-608f-448d-a4f8-af05bb4d42d8", "db_session_id": "OSRMBNXDC8EXU3R2EA69", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.431113) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 12714908 bytes
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.450818) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 279.8 rd, 221.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 14.1 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(22.6) write-amplify(10.0) OK, records in: 7570, records dropped: 468 output_compression: NoCompression
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.450853) EVENT_LOG_v1 {"time_micros": 1769423395450840, "job": 50, "event": "compaction_finished", "compaction_time_micros": 57473, "compaction_time_cpu_micros": 23764, "output_level": 6, "num_output_files": 1, "total_output_size": 12714908, "num_input_records": 7570, "num_output_records": 7102, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395451238, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769423395453451, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.373374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453503) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453505) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453507) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: rocksdb: (Original Log Time 2026/01/26-10:29:55.453509) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3093970821' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 26 05:29:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:55.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:55 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:55 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:55 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:55.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 26 05:29:55 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/288116703' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4002871036' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4191693100' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.436 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.685 226326 DEBUG oslo_service.periodic_task [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1771140372' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.708 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 26 05:29:56 np0005595445 nova_compute[226322]: 2026-01-26 10:29:56.709 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 26 05:29:56 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2958203862' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2765354709' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1027645102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.172 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.321 226326 WARNING nova.virt.libvirt.driver [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.322 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4517MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.323 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.323 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.387 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.388 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3989524323' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.417 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1625537296' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 26 05:29:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:57.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:57 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:57 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:29:57 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:57.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3510057928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2644301426' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.891 226326 DEBUG oslo_concurrency.processutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.897 226326 DEBUG nova.compute.provider_tree [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed in ProviderTree for provider: d06842a0-5d13-4573-bb78-d433bbb380e4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 26 05:29:57 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1763956724' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.926 226326 DEBUG nova.scheduler.client.report [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Inventory has not changed for provider d06842a0-5d13-4573-bb78-d433bbb380e4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.928 226326 DEBUG nova.compute.resource_tracker [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 26 05:29:57 np0005595445 nova_compute[226322]: 2026-01-26 10:29:57.928 226326 DEBUG oslo_concurrency.lockutils [None req-ba7d7d2b-4eba-4a3c-82bc-401d9e169cf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 26 05:29:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:29:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:29:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:57 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:29:58 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:29:58 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/844049035' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 26 05:29:58 np0005595445 systemd[1]: Starting Hostname Service...
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/181875530' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 26 05:29:58 np0005595445 systemd[1]: Started Hostname Service.
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2958399404' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3155950399' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 26 05:29:58 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4061816220' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3651764905' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2654257985' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/44222642' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 26 05:29:59 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2665981540' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 26 05:29:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:29:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:29:59 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:29:59 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:29:59 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:29:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:30:00 np0005595445 ceph-mon[80107]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Jan 26 05:30:00 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 26 05:30:00 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/367892269' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 26 05:30:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 26 05:30:01 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1413975470' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 26 05:30:01 np0005595445 nova_compute[226322]: 2026-01-26 10:30:01.438 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:30:01 np0005595445 nova_compute[226322]: 2026-01-26 10:30:01.441 226326 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 26 05:30:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:30:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 26 05:30:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:30:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 26 05:30:01 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:30:01 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:30:01 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:30:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:30:01 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 26 05:30:01 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2067073806' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1558495807' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:30:02 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 05:30:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:02 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 05:30:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 26 05:30:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 26 05:30:03 np0005595445 ceph-1a70b85d-e3fd-5814-8a6a-37ea00fcae30-nfs-cephfs-0-0-compute-1-thyhvc[230880]: 26/01/2026 10:30:03 : epoch 69773d03 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 26 05:30:03 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1342520033' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 26 05:30:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:30:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:30:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.102 - anonymous [26/Jan/2026:10:30:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:30:03 np0005595445 radosgw[82065]: ====== starting new request req=0x7f17ab86f5d0 =====
Jan 26 05:30:03 np0005595445 radosgw[82065]: ====== req done req=0x7f17ab86f5d0 op status=0 http_status=200 latency=0.001000027s ======
Jan 26 05:30:03 np0005595445 radosgw[82065]: beast: 0x7f17ab86f5d0: 192.168.122.100 - anonymous [26/Jan/2026:10:30:03.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 26 05:30:04 np0005595445 ceph-mon[80107]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 26 05:30:04 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/186538420' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 26 05:30:04 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 26 05:30:04 np0005595445 ceph-mon[80107]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
